#340 – Chris Tarbell: FBI Agent Who Took Down Silk Road

Lex Fridman Podcast

Full Transcription:

[0] The following is a conversation with Chris Tarbell, a former FBI special agent and cybercrime specialist who tracked down and arrested Ross Ulbricht, the leader of Silk Road, the billion-dollar drug marketplace, and he tracked down and arrested Hector Monsegur, a.k.a. Sabu, of LulzSec and Anonymous, which are some of the most influential hacker groups in history.

[1] He is co-founder of Naxo, a complex cybercrime investigation firm, and is a co-host of a podcast called The Hacker and the Fed. This conversation gives the perspective of the FBI cybercrime investigator, both the technical and the human story.

[2] I would also like to interview people on the other side, the cybercriminals who have been caught, and perhaps the cyber criminals who have not been caught, and are still out there.

[3] And now, a quick few-second mention of each sponsor.

[4] Check them out in the description.

[5] It's the best way to support this podcast.

[7] We got True Classic Tees for shirts, InsideTracker for biomonitoring, ExpressVPN for privacy, BetterHelp for mental health, and Blinkist for nonfiction.

[8] Choose wisely, my friends.

[9] And now onto the full ad reads, as always, no ads in the middle.

[10] I try to make this interesting, but if you skip them, please still check out our sponsors.

[11] I enjoy their stuff.

[12] Maybe you will too.

[13] This show is brought to you by True Classic Tees.

[14] High-quality, soft, slim-fitted t-shirts for men.

[16] They also make other men's wear staples like polos, workout shirts, and boxers.

[17] But I have a lot of their black t-shirts.

[18] That's my main go-to.

[19] I'm not exactly sure why.

[20] But there's a certain kind of comfort in having great t-shirts that all look the same, and having many of them.

[21] So it removes that extra little decision in your life.

[22] So you can liberate your mind to focus on the more difficult decisions in your life.

[24] So it's just this reliable thing I can count on.

[25] Either I wear a suit or I wear a True Classic t-shirt.

[26] That's it.

[27] That's all I need to worry about.

[28] Life is simple.

[29] And there's a kind of minimalist aesthetic to a black t-shirt that just brings out the best in me, makes my soul sing.

[30] I think it's also in part a programmer aesthetic, an engineer aesthetic.

[31] I'm not exactly sure.

[32] But I do know that a lot of programmers I hang out with often wear black t-shirts.

[33] So I'm not sure what that's about.

[34] That could also just be in general a guy thing.

[35] I'm going to have to get some data on that.

[36] Anyway, go to trueclassic.com and enter code Lex to get 25% off.

[37] This show is also brought to you by Inside Tracker, a service I use to track biological data.

[38] Your lifestyle decisions should be made based on data coming from your own body.

[39] I can't wait until the day that we have high bandwidth signal coming from the body at a frequency that is exceptionally high.

[40] So we have this short -term and long -term data about what's going on inside our body.

[41] Just raw data.

[42] So machine learning algorithms can just interpret that data to make decisions based on it.

[43] I mean, to me, that's such an exciting world of creating systems that are able to truly listen to our body.

[44] There's experiences I have by going to doctors. I think the job of a doctor is so difficult.

[46] They get just a few little inklings from the symptoms you provide.

[47] There's some data they can collect.

[48] They can do MRIs and all those kinds of scans.

[49] It's not a high-resolution picture of what's going on in your body.

[50] Now, if you're the average case for a particular condition or disease or a particular issue you're having in your life, yeah, fine.

[51] But a lot of us are not the perfectly representative average case.

[52] In fact, most humans aren't.

[53] And so it makes sense that we should be looking at that specific person to make decisions for that specific person.

[54] Anyway, get special savings for a limited time when you go to insidetracker.com/lex.

[55] This show is brought to you by ExpressVPN.

[56] I use them to protect my privacy on the internet.

[57] This conversation talks a lot about Tor, which is a super extreme way to protect your privacy on the internet.

[58] Now, that's like advanced stuff.

[59] The basic stuff that everybody should be doing is a VPN.

[60] Everybody.

[61] And my favorite VPN, long, long, long before they were a sponsor, has been ExpressVPN.

[62] Big sexy button.

[63] It just works.

[64] It's super fast.

[65] Any operating system, including Linux, whatever your favorite flavor of Linux is, and I've tried them all.

[66] I like all of the flavors.

[67] That's actually factually incorrect, because I love all the flavors of Linux that I've tried, but there's a huge amount of them.

[69] I think there's a website called DistroWatch that looks at the popularity of different distributions of Linux based on how often they're searched, I think.

[70] It's kind of cool to see all the different flavors.

[71] It's really exciting how active the community is in the development of those flavors.

[72] Anyway, go to expressvpn.com/lexpod for an extra three months free.

[73] This episode is sponsored by BetterHelp, spelled H-E-L-P, help.

[74] I think there's a lot of ways in which social media reveals the mental instabilities that we have.

[75] That's sort of the roller coaster of life.

[76] And it's easy to lose yourself in that and not seek balance and a deep exploration of your mind beyond that kind of shallow roller coaster.

[77] Now, I'm a huge believer in talk therapy as a way to do that kind of serious exploration.

[78] However, you do that.

[79] And I think the great thing about BetterHelp is that it's super easy to do that.

[80] It makes it accessible to try.

[81] You get access to a licensed professional really quickly.

[82] Your mind is the most precious thing you have, so make sure you take care of it.

[83] It's easy, private, affordable, available anywhere.

[84] You can check it out at BetterHelp.com/lex and save on your first month.

[86] This show is also brought to you by Blinkist, my favorite app for learning new things.

[87] Blinkist takes key ideas from thousands of nonfiction books and condenses them down into 15 minutes that you can read or listen to.

[88] There's actually AI systems that I've recently been seeing pop up that do summarization.

[89] And let me tell you something.

[90] While that's nice and everything, they do not do nearly as good of a job as humans do.

[91] Especially when those humans are the sort of world-class humans, whoever they are, behind Blinkist.

[92] There's really an extra level, an extra depth of insight that Blinkist is able to provide for nonfiction books.

[93] It's not just that it's brief.

[94] It also somehow reveals something new.

[95] Even for books I've read, revisiting the summaries gives me a new perspective on that book.

[96] I don't know.

[97] It's really, really powerful.

[98] So I recommend it not just for books you haven't read, but also for books you have read.

[99] And it includes basically all the major nonfiction books you can think of.

[100] You can claim a special offer for savings if you visit blinkist.com/lex.

[101] This is the Lex Fridman podcast.

[102] To support it, please check out our sponsors in the description.

[103] And now, dear friends, here's Chris Tarbell.

[104] You are one of the most successful cybersecurity law enforcement agents of all time.

[106] You tracked and brought down Ross Ulbricht, aka Dread Pirate Roberts, who ran Silk Road, and Sabu of LulzSec and Anonymous, who was one of the most influential hackers in the world.

[107] So first, can you tell me the story of tracking down Ross Ulbricht and Silk Road?

[108] Let's start from the very beginning.

[109] And maybe let's start by explaining what is the Silk Road.

[110] It was really the first dark market website.

[111] You literally could buy anything there.

[113] Well, I'll take that back.

[114] There's two things you couldn't buy there.

[115] You couldn't buy guns because that was a different website.

[116] And you couldn't buy fake degrees.

[117] So no one could become a doctor.

[118] But you could buy literally whatever else you wanted.

[119] You could host things, drugs.

[120] You could buy heroin right from Afghanistan, the good stuff.

[121] Hacking tools.

[122] You could hack for hire.

[123] You could buy murders for hire if you wanted someone killed.

[124] Now, so when I was an FBI agent, I had to kind of sell some of these cases.

[125] And this was a big drug case.

[126] You know, that's the way people saw Silk Road.

[127] So internally to the FBI, how I had to sell it, I had to find the worst thing on there that I could possibly find.

[128] And I think one time I saw a posting for baby parts.

[129] So let's say that you had a young child that needed a liver.

[130] You could literally go on there and ask for a six-month-old liver if you wanted to.

[131] For like surgical operations versus something darker.

[132] Yeah, I never saw anything that dark as far as people that wanted body parts.

[133] I did interview a cannibal once when I was in the FBI.

[134] That's another crazy story, but that one actually weirded me out.

[135] So I just watched the Jeffrey Dahmer documentary on Netflix, and it just changed the way I see human beings, because it's a portrayal of a normal-looking person doing really dark things, and doing so not out of a place of insanity, seemingly, but just because he has almost like a fetish for that kind of thing.

[136] It's disturbing that people like that are out there.

[137] So people like that would then be using Silk Road.

[138] Not like that necessarily, but people of different walks of life would be using Silk Road.

[139] What was the primary thing?

[140] Drugs.

[141] It was primarily drugs.

[142] And that's where it started.

[143] It started off with Ross Ulbricht growing mushrooms out in the wilderness of California and selling them.

[144] But really, his was more of a libertarian viewpoint.

[145] I mean, it was like you choose what you want to do for yourself and do it.

[146] And the way Silk Road kind of had the anonymity is it used what's called Tor, the onion router, which is an anonymizing function on the deep web.

[147] It was actually invented by the U.S. Navy back in the mid-90s or so.

[148] But it also used cryptocurrency.

[149] So it was the first time, like, we saw this birth on the internet, mixing cryptocurrency and IP-blocking software.

[150] So, you know, in cybercrime, you go after, one, the IP address, and trace it through the network.

[151] Or, two, you go after the cash. And this one kind of blocked both. Cash meaning the flow of money, physical or digital. And then IP is some kind of identifying thing of the computer? It's the telephone number for your computer. So, yeah, all computers have, you know, a unique four-octet number, you know, it's 123.123.123.123. And, you know, the computer uses DNS, the domain name system, to resolve that name.

[152] So if you were looking for, you know, CNN.com, your computer then translates that to that IP address, that telephone number, where it can find that information.
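The "telephone number" analogy above maps directly to code. A minimal Python sketch: `octets` and `resolve` are illustrative helper names (not from the conversation), and an actual DNS lookup needs network access:

```python
# An IPv4 address is four octets (0-255 each); DNS maps a hostname to one.
import ipaddress
import socket

def octets(addr: str) -> list[int]:
    """Split an IPv4 address into its four octets."""
    return list(ipaddress.IPv4Address(addr).packed)

def resolve(hostname: str) -> str:
    """Ask the OS resolver, which uses DNS, for the host's IPv4 address."""
    return socket.gethostbyname(hostname)

print(octets("123.123.123.123"))  # [123, 123, 123, 123]
# resolve("cnn.com") would return whatever address DNS serves at the time.
```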

[153] Didn't Silk Road use to have guns in the beginning?

[154] Or was that a conscious decision, to have guns?

[155] Or was it naturally emerge?

[156] And then Ross realized, like, this is not good?

[157] It went back and forth.

[158] I think there were guns on there.

[159] And he tried to police it.

[160] You know, he saw himself as the captain of the boat, so you had to follow his rules.

[161] So, you know, I think he took off those posts eventually and moved guns elsewhere.

[162] What was the system of censorship that he used, of selecting what is okay and not okay?

[163] I mean, it's...

[164] Him alone.

[165] He's the captain of the boat.

[166] Do you know by chance if there was a lot of debates and criticisms internally amongst the criminals of what is and isn't allowed?

[167] I mean, it's interesting to see a totally different moral code emerge that's outside the legal code of society.

[168] We did get the server and were able to read all of the chat logs that happened.

[169] I mean, all the records were there.

[170] I don't remember big debates.

[171] I mean, there was a clear leadership.

[172] And that was the final decision.

[173] That was the CEO of Silk Road.

[174] And so primarily it was drugs and primarily out of an ideology of freedom, which is if you want to use drugs, you should be able to use drugs.

[175] You should put in your body what you want to put in your body.

[176] And when you were presenting a case of why this should be investigated, you're trying to find, as you mentioned, the worst possible things on there?

[177] Is that what you were saying?

[178] So we had arrested a guy named Jeremy Hammond, and he had hidden himself.

[179] He was a hacker, and we had arrested him.

[180] It was the second time he had been arrested for hacking.

[181] He used Tor.

[182] And so that kind of brought us to a point.

[183] The FBI has a computer system where you look up things, you know, you look up anything.

[184] I could look up your name or whatever, if you're associated with my case.

[186] And we were finding at the time a lot of things where you'd look it up, a case would end, be like, oh, this is Tor.

[187] And it just stopped.

[188] Like, we couldn't get any further.

[189] So, you know, we had just had this big arrest of Sabu and took down Anonymous.

[190] And sometimes in the FBI, the way it used to be, the old-school FBI, when you had a big case and you're working seven days a week and 14, 15 hours a day, you sort of take a break.

[191] The boss kind of said, yeah, I'll see you in a few months.

[192] Go get to know your family a little bit, you know, and come back.

[193] But the group of guys I was with was like, let's find the next big challenge.

[194] And that's when we were finding, you know, case closed, it was Tor.

[195] Case closed, it was Tor.

[196] So I said, let's take a look at Tor and let's see what we can do.

[197] Maybe we'll take a different approach.

[198] And Silk Road was being looked at by other law enforcement, but it was taking like a drug approach, where I'm going to find a drug buyer who got, you know, the drugs sent to them in the mail, and let's arrest them.

[199] Let's go up the chain.

[200] But the buyers didn't know their dealer.

[201] They never met them.

[202] And so you were taking a cyber security approach.

[203] Yeah, we said, let's try to look at this from a cyber approach and see if we can glean anything out of it.

[204] So I'm actually indirectly connected to, I'm sure I'm not admitting anything that's not already on my FBI file.

[205] Oh, I can already tell you what you're going to tell me, though.

[206] What's that?

[207] That when you were at college, you wrote a paper, and you're connected to the person that started it.

[208] You son of a bitch.

[209] You clever son of a bitch.

[210] FBI agent or a former FBI agent.

[211] Well, how would I not have known that?

[212] No, but I could have told you other stuff.

[213] No, that's exactly what you were about to tell me. I was looking up his name because I forgot it.

[214] So one of my advisors for my PhD was Rachel Greenstadt, and she is married to Roger Dingledine, who is the co-founder of the Tor project.

[215] And I actually reached out to him last night to do a whole podcast together.

[216] I don't know.

[217] No, it was a good, it was a good party trick.

[218] I mean, it was just cool that you know this and the timing of it, it was just like beautiful.

[219] But just to linger on the Tor project, so we understand.

[220] So Tor is this black box that people disappear in, in terms of, like, when you were tracking people.

[221] Can you paint a picture of how Tor is used in general?

[222] It's like when you talk about Bitcoin, for example, cryptocurrency, especially today, much more people use it for legal activity versus illegal activity. What about Tor? Tor was originally invented by the U.S. Navy so that, like, spies inside countries could talk to spies and no one could find them. There was no way of tracing them. And then they released that information free to the world. So Tor has two different ways it can be utilized. There's dot onion sites, which is like a normal website, a dot com, but it's only found within the Tor browser.

[223] You can only get there if you know the full address.

[224] The other way Tor is used is to go through the internet and then come out the other side if you want a different IP address, if you're trying to hide your identity.

[225] So if you were doing like, say, cybercrime, I would have the victim computer and I would trace it back out to a Tor relay.

[226] And then because you don't have an active connection or what's called a circuit at the time, I wouldn't be able to trace it back.

[227] But even if you had an active circuit, I would have to go to each machine physically, live, and try to rebuild that, which is literally impossible.

[229] So how do you feel about Tor, ethically, philosophically, as a human being in this world who spent quite a few years of your life, and still is, trying to protect people?

[230] So part of my time in the FBI was working on child exploitation, kiddie porn, as I call it, and that really changed my life in a way.

[231] And so anything that helps facilitate the exploitation of children fucking pisses me off. And that sort of jaded my opinion toward Tor, because it helps facilitate those sites. So this ideal of freedom that Ross Ulbricht, for example, tried to embody is something that you don't connect with anymore, because of what you've seen that ideal being used for? I mean, the child exploitation is a specific example of it, you know. And it's easy for me to sit here and say child exploitation, child porn, because no one listening to this is ever going to say that I'm wrong and that we should allow child porn. Should it, because some people utilize it in a bad way, should it go away? No. I mean, I'm a technologist. I want technology to move forward. You know, people are going to do bad things, and they're going to use technology to help them do bad things. Well, let me ask you then, we'll jump around a little bit, but the things you were able to do in tracking down information, and we'll get to it, there is some suspicion that this was only possible with mass surveillance, like with the NSA, for example.

[232] First of all, is there any truth to that?

[233] And second of all, what do you feel are the pros and cons of mass surveillance?

[234] There is no truth to that.

[235] And then my feelings on mass surveillance.

[236] If there was, would you tell me?

[237] Probably not.

[238] I love this conversation so much.

[239] But what do you feel about it, given that you said child porn?

[240] What are the pros and cons of surveillance at a society level?

[241] I mean, nobody wants to give up their privacy.

[242] I say that.

[243] I say no one wants to give up their privacy.

[244] But, I mean, I used to have to get a search warrant to look inside your house.

[245] Or I can just log onto your Facebook and you've got pictures of all inside your house and what's going on.

[246] I mean, it's not, you know, so people like the idea of not giving up their privacy, but they do it anyways.

[247] They're giving away their freedoms all the time.

[248] They're carrying watches that give out their heartbeat to companies that are storing that.

[249] I mean, what's more personal than your heartbeat?

[250] So I think people en masse really want to protect their privacy, and I would say most people don't really need to protect their privacy.

[251] But the case against mass surveillance is that if you want to criticize the government in a very difficult time, you should be able to do it.

[253] So when you need the freedom, you should have it.

[254] So when you wake up one day and realize there's something going wrong with the country I love, I want to be able to help.

[255] One of the great things about the United States of America is there's that individual revolutionary spirit, so that the government doesn't become too powerful.

[256] You can always protest.

[257] There's always the best of the ideal of freedom of speech.

[258] You can always say, fuck you to the man. And I think there's a concern of direct or indirect suppression of that through mass surveillance.

[259] You might not even notice it, it's that little subtle fear that grows with time.

[260] That, you know, why bother criticizing the government?

[261] It's going to be a headache.

[262] I'm going to get a ticket every time I say something bad.

[263] That kind of thing.

[264] So it can get out of hand.

[265] The bureaucracy grows and the freedoms slip away.

[266] That's the criticism, right?

[267] I completely see your point.

[268] I agree with it.

[269] I mean, but on the other side, people criticize the government over these freedoms, but, I mean, we talk about tech companies destroying your privacy and controlling what you can say.

[270] I realize they're private platforms, and they can decide what's on their platform.

[271] But, you know, they're taking away your freedoms of what you can say.

[272] And we've heard some things where maybe government officials were in line with tech companies to take away some of that freedom.

[273] And I agree with you.

[274] That gets scary.

[275] Yeah, there's something about government that feels maybe because of the history of human civilization, maybe because tech companies are a new thing.

[276] But just knowing the history of abuses of government, there's something about government that enables the corrupting nature of power to take hold at scale, more than tech companies, at least from what we've seen so far.

[278] I agree, I agree.

[279] But, I mean, we haven't had a voice like we've had until recently.

[280] I mean, anyone that has a Twitter account now can speak and become a news article.

[281] You know, my parents didn't have that, didn't have that voice.

[282] If they wanted to speak out against the government or do something, they had to go to a protest or organize a protest or, you know, do something along those lines.

[283] So, you know, we have more of a place to put our voice out now.

[284] Yeah, it's incredible.

[285] But that's why it hurts, and that's why you notice it when certain voices get removed.

[286] The president of the United States of America was removed from one such or all such platforms.

[287] And that hurts.

[288] Yeah, that's crazy to me. That's insane.

[289] That's insane that we took that away.

[290] But let's return to Silk Road and Ross Albrecht.

[291] So how did your path cross with this very difficult, very fascinating case?

[293] We were looking to open a case against Tor because it was a problem.

[294] All the cases were closing because of Tor.

[295] So we went on Tor and we came up with 26 different dot onions that we targeted.

[296] We were looking for a nexus to hacking, because I was on a squad called CY2, and we were like the premier squad in New York that was working criminal cyber intrusions.

[297] And so, you know, any website that was offering hackers for hire or hacking tools for free, you know, or paid services, you know, like now we're seeing ransomware as a service and phishing as a service.

[298] Anything that offered that.

[299] So we opened this case on, I think we called it, so you have to name cases.

[300] One of the fun thing in the FBI is when you start a case, you get to name it.

[301] You would not believe how much time is spent in coming up with the name.

[302] You know, cases go by a name. I think we called this one Onion Peeler, because of the, yeah.

[303] So a little bit of humor, a little bit of wit, and some profundity to the language.

[304] Yeah, yeah.

[305] Because you're going to have to work with this for quite a lot.

[306] Yeah, this one had the potential of being a big one, you know, because I think Silk Road was like the sixth on the list for that case, but we all knew that was sort of the golden ring.

[307] If you could make the splash that that onion site was going down, then we would probably get some publicity.

[308] And that's part of, you know, law enforcement, is getting some publicity out of it that, you know, that makes others think not to do it.

[309] I should say that Tor is the name of the project, the browser.

[310] What is the onion technology behind Tor?

[311] Let's say you want to go to a dot onion site.

[312] You'll put in the dot onion you want to go to, and your computer will build communications with a Tor relay, which are all publicly available out there.

[313] But you'll encrypt it.

[314] You'll put a package around your data, so it's encrypted, and so it can't be read.

[315] It goes to that first relay.

[316] That first relay knows about you, and then knows about the next relay down the chain.

[317] And so it takes your data and then encrypts that on the outside and sends it to relay number two.

[318] Now, relay number two only knows about relay number one.

[319] It doesn't know who you are that's asking for this, and it goes through there, adding those layers of encryption on top to get to where it's going. And then even the onion service doesn't know who it's talking to, except for the relay it came from.

[320] And so it peels back that layer, gives the information, puts another layer back on.

[321] And so it's layers like you're peeling an onion back of the different relays, and that encryption protects who the sender is and what information they're sending.
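The layering and peeling described here can be sketched in a few lines of Python. This is a toy model only: XOR stands in for real ciphers, the per-relay keys and relay names are made up, and real Tor negotiates per-hop keys over the circuit rather than deriving them from names:

```python
# Toy sketch of onion routing's layered encryption (illustrative only).
import hashlib
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR stand-in for a real cipher; same call encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def hop_key(relay_name: str) -> bytes:
    """Hypothetical per-relay key derived from the relay's name."""
    return hashlib.sha256(relay_name.encode()).digest()

def wrap(message: bytes, relays: list[str]) -> bytes:
    """The sender adds one layer per relay; the first relay's layer is outermost."""
    for relay in reversed(relays):
        message = xor_cipher(message, hop_key(relay))
    return message

def unwrap_one(onion: bytes, relay: str) -> bytes:
    """Each relay peels exactly one layer as the cell moves down the circuit."""
    return xor_cipher(onion, hop_key(relay))

relays = ["relay1", "relay2", "relay3"]
onion = wrap(b"GET example.onion", relays)
for relay in relays:  # traffic passes through the circuit, one peel per hop
    onion = unwrap_one(onion, relay)
print(onion)  # b'GET example.onion'
```

Note how after the first relay peels its layer, the payload is still ciphertext to anyone watching; only the last hop sees the plaintext request, and only the first hop knows the sender.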

[322] The more layers there are, the more exponentially difficult it is to decrypt it.

[323] I mean, you get to a place where you don't have to have so many layers, because it doesn't matter anymore.

[325] It's mathematically impossible to decrypt it.

[326] But, you know, the more relays you have, the slower it is.

[327] I mean, that's one of the big drawbacks of Tor, is how slow it operates.

[328] So how do you peel the onion?

[329] So what are the different methodologies for trying to get some information from a cybersecurity perspective on these operations like the Silk Road?

[330] It's very difficult.

[331] People have come up with different techniques.

[332] There's been techniques to put out in the news media about how they do it.

[333] Running, like, massive amounts of relays, and you're controlling those relays.

[334] I think somebody tried that once.

[335] So there's a technical solution, and what about social engineering?

[336] What about trying to infiltrate the actual humans who are using the Silk Road, and trying to get in that way?

[337] Yeah, I mean, I definitely could see the way of doing that.

[338] And in this case, in our takedown, we used that.

[339] There was one of my partners, Jared Der-Yeghiayan.

[340] He was an HSI investigator, and he had worked his way up to be a system admin on the site.

[341] So that did glean quite a bit of information, because he was inside and talking to, you know, at that time, the person we knew as DPR, or Dread Pirate Roberts.

[342] We didn't know who that was yet, but we had that open communication.

[343] You know, and one of the things, you know, the technical aspect on that, is there was a Jabber server, that's a type of communication server, that was being used.

[344] And we knew that Ross had his Jabber set to Pacific time.

[345] So we had a pretty good idea what part of the country he was in.
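The timezone clue is simple enough to sketch: a client that leaks its configured UTC offset narrows the user down to the zones matching that offset on that date. The zone-to-region table below is a hypothetical simplification for illustration:

```python
# Narrow down a user's region from a leaked UTC offset (toy example).
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical mapping of continental-US zones to regions.
US_ZONES = {
    "America/New_York": "East Coast",
    "America/Chicago": "Central",
    "America/Denver": "Mountain",
    "America/Los_Angeles": "West Coast (Pacific)",
}

def regions_for_offset(leaked_offset_hours: float, when: datetime) -> list[str]:
    """Return US regions whose UTC offset matches the leaked one at that date."""
    matches = []
    for zone, region in US_ZONES.items():
        offset = when.astimezone(ZoneInfo(zone)).utcoffset()
        if offset.total_seconds() / 3600 == leaked_offset_hours:
            matches.append(region)
    return matches

# A timestamp stamped UTC-7 in October (when Pacific is on daylight time)
# points to the West Coast:
print(regions_for_offset(-7, datetime(2013, 10, 1, 12, tzinfo=ZoneInfo("UTC"))))
```

The date matters because daylight saving shifts every zone's offset, which is exactly the kind of detail an investigator has to account for.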

[346] I mean, isn't that, from DPR's perspective, from Ross's perspective, isn't that clumsy?

[347] He wasn't a, he wasn't a big computer guy.

[348] Do you notice that aspect, that the technical savvy of some of these guys doesn't seem to be quite there? Why weren't they good at this?

[349] Well, the real techie savvy ones, we don't arrest.

[350] We don't get to them.

[351] We don't find them.

[352] Shout out to the techie criminals.

[353] They're probably watching this.

[354] I mean, yeah, I mean, we're getting the low-hanging fruit.

[355] I mean, we're getting the ones that can be caught.

[356] I mean, you know, I'm sure we'll talk about it, but in the Anonymous case, there was a guy named AV unit.

[357] I still lose sleep over him, because we didn't catch him.

[358] We caught everybody else.

[359] We didn't catch him.

[360] He's good, though. He pops up, too, once in a while on the internet, and it pisses me off. Yeah, what's his name again? AV unit. That's all I know, is his handle, AV unit. AV unit, yeah. I got a funny story about him and what people think he is. Actually, can we go on that brief tangent? Sure, I love tangents. Well, let me ask you, since he's probably... he or she? Do we know it's a he? We have no idea. Okay. That's another funny story about hackers, the he/she issue. What's the funny story there?

[361] Well, one of the guys in LulzSec was a she, was a 17-year-old girl.

[362] And my source in the case, the guy Sabu, that I arrested, and, you know, we sat side by side for nine months and then took down, you know, the case and all that.

[363] He was convinced she was a girl and he was in love with her almost at one point.

[364] It turns out to be a 35-year-old guy living in England.

[365] Oh, so he was convinced it was a...

[366] Yes.

[367] He was absolutely convinced.

[368] Based on what exactly? Like, human-based linguistic analysis?

[369] She, he, whatever, you know, Kayla, which ended up being like a modification of his sister's name, the real guy's sister's name, was so good at building the backstory.

[370] All these guys, and it's funny, like, these guys are part of a hacking crew.

[371] They social engineer the shit out of each other.

[372] Just so that if one of them ever gets caught, they'll convince everybody else that, you know, they're a Brazilian, you know, ISP owner or something like that.

[373] And that's why they're so powerful.

[374] Well, yeah, that social engineering aspect is part of living a life of cyber crime or cybersecurity on the offensive or defensive.

[375] So AV unit, can I ask you also just a tangent of a tangent first?

[376] That's my favorite tangent.

[377] Okay.

[378] Is it possible for me to have a podcast conversation with somebody who hasn't been caught yet?

[379] And because they have the conversation, they still won't be caught.

[380] And is that a good idea?

[381] Meaning, is there a safe way for a criminal to talk to me on a podcast?

[382] I would think so.

[383] I would think that someone could, I mean, someone who has been living a double life for long enough where you think they're not a criminal?

[384] No, no, no. They would have to admit it. They would say, I am AV unit.

[385] Oh, you would want to have a conversation with AV unit.

[386] Yes.

[387] Is there a way?

[388] I'm just speaking from an FBI perspective, technically speaking.

[389] Because I, so let me explain my motivation.

[390] I think I would like to be able to talk to people from all walks of life and understanding criminals, understanding their mind, I think is very important.

[391] And I think there's fundamentally something different between a criminal who's still active versus one that's been caught.

[392] The mind, just from observing it, changes completely once you're caught.

[393] You have a big shift in your understanding of the world.

[394] I mean, I do have a question about the ethics of having such conversations, but first, technically, is it possible?

[395] If I was technically advising you, I would say, first off, don't advertise it.

[396] The fewer people you're going to tell that you're having this conversation with, the better.

[397] And yeah, you could. Would you do it in person?

[398] Are you doing it?

[399] In person would be amazing, yeah.

[400] But their face would not be shown.

[401] Face would not be shown.

[402] Yeah, I mean, you couldn't publish a show for a while.

[403] They'd have to put a lot of trust in you, that you are not going to, you're going to have to alter those tapes.

[404] I say tapes because it's old school.

[405] You know.

[406] It's a tape.

[407] Exactly.

[408] I'm sure a lot of people just said that.

[409] Like, oh, shit, this old guy just said tape.

[410] I've heard of VHS. It was in the 1800s, I think.

[411] But yeah, you could do it. They'd have to have complete faith and trust in you, that you destroy the originals after you've altered them. What about if they don't have faith? Is there a way for them to attain security? Like, for me to go through some kind of process where I meet them somewhere? I mean, you're not going to do it without a bag over your head. I don't know if that's the life you want to live. I'm fine with a bag over my head. That's going to get taken out of context.

[412] But I just, I think it's a worthy effort.

[413] It's a worthy, it's worthy to go through the hardship of that, to understand the mind of somebody.

[414] I think fundamentally, conversations are a different thing than the operation of law enforcement.

[415] Understanding the mind of a criminal, I think is really important.

[416] I don't know if you're going to have the honest conversation that you're looking for.

[417] I mean, it may sound honest, but it may not be the truth.

[418] I found most times when I was talking to criminals, it's lies mixed with half -truths.

[419] And you kind of, it's, if they're good, they can keep that story going for long enough.

[420] If they're not, you know, you kind of see the relief in them when you finally break that wall down.

[421] That's the job of an interviewer.

[422] If the interviewer is good, then perhaps not directly, but through the gaps, seeps out the truth of the human being.

[423] So not necessarily the details of how they do the operations and so on, but just who they are as a human being. What their motivations are, what their ethics are, how they see the world. What is good, what is evil? Do they see themselves as good? What do they see their motivation as? Do they have resentment? What do they think about love for the people within their small community? Do they have resentment for the government, or for other nations, or for other people? Do they have childhood issues that led to a different view of the world than others perhaps have? Do they have certain fetishes, sexual and otherwise, that led to their construction of the world? They might be able to reveal some deep flaws in the cybersecurity infrastructure of our world, not in detail, but philosophically speaking.

[424] They might have, I know you might say it's just a narrative, but they might have a kind of ethical concern for the well -being of the world, that they're essentially attacking the weakness of the cyber security infrastructure because they believe ultimately that would lead to a safer world.

[425] So the attacks will reveal the weaknesses.

[426] And if they're stealing a bunch of money, that's okay because that's going to force you to invest a lot more money in defending, yeah, defending things that actually matter, you know, nuclear warheads and all those kinds of things.

[427] I mean, I could see, you know... It's fascinating to explore the mind of a human being like that, because I think it will help people understand.

[428] Now, of course, it's still a person that's creating a lot of suffering in the world, which is a problem.

[429] So do you think ethically it's a good thing to do?

[430] I don't.

[431] I mean, I feel like I have a fairly high ethical bar that I hold myself to, and I don't think I'd have a problem with it.

[432] I would love to listen to it.

[433] Okay, great. I mean, not that I'm your ethical coach here. Well, that's interesting, because I thought you would have become jaded and exhausted by the criminal mind. It's funny, you know. Fast forward in our story, I'm very good friends with Hector Monsegur, or Sabu, the guy I arrested. And he tells stories of what he did, and I'm like, oh, that Hector, you know.

[434] But then I listened to your episode with Brett Johnson, and I was like, this guy stealing money from the U.S. government, and welfare fraud, and all these sorts of things.

[435] It just pissed me off.

[436] And I don't know why I have that differentiation in my head.

[437] I don't know why I think one's just, oh, Hector will be Hector, and then this guy just pissed me off.

[438] Well, you didn't feel that way about Hector until you probably met him.

[439] Well, I didn't know Hector.

[440] I knew Sabu.

[441] So I hunted down Sabu, and I learned about Hector over those nine months.

[442] We'll talk about it a little.

[443] Let's return back to the tangent.

[444] Oh, one tangent up.

[445] Who's AV unit?

[446] I don't know.

[447] Interesting.

[448] So he's at the core of anonymous.

[449] He's one of the critical people in anonymous.

[450] What is known about him?

[451] There's what's known in public and what was known because he sat with Hector.

[452] And he was sort of like the set-things-up guy.

[456] So LulzSec had their hackers, which were Sabu and Kayla, and they had their media guy, this guy Topiary; he lived up in the northern end of England.

[457] And they had a few other guys, but AVunit was the guy that set up infrastructure.

[458] So if you needed a VPN in Brazil or something like that to pop through. One of the first things Hector told me after we arrested him is that AVunit was a Secret Service agent.

[459] And I was like, oh, shit.

[460] Just because he kind of lived that lifestyle.

[461] He'd be around for a bunch of days and then all of a sudden gone for three weeks.

[462] And I was trying to get more out of Hector, but that early on in the relationship, you know, I'm sure he was a little bit guarded.

[463] Maybe trying to social engineer me. Maybe he wanted that, oh shit, there's law enforcement involved in this.

[464] And not to say, I mean, I was in over my head with that case, just the amount of work that was going on.

[465] So to track them all down, plus the 350 hacks that came in about just military institutions, you know, I was swimming in the deep end.

[466] So it was just at the end of the case, I looked back and I was like, oh, fuck, AVunit.

[467] I could have had them all.

[468] You know, maybe that's the perfectionist in me. Oh, man. Well, reach out somehow.

[469] I won't say how, right?

[470] We'll have to figure it out.

[471] Would you have them on?

[472] Yeah.

[473] Oh, my God.

[474] Just let me know.

[475] Just talk shit about you the whole time.

[476] That's perfect.

[477] He probably doesn't even care about me. Well, now he will because there's a certain pleasure of a guy who's extremely good at his job, not catching another guy who's extremely good at his job.

[478] Obviously better.

[479] He got away.

[480] There you go.

[481] He's still eating at you.

[482] I love it.

[483] Or she.

[484] If I can meet that guy one day, he or she, that'd be great.

[485] I mean, I have no power.

[486] So, yes, Silk Road.

[487] Can you speak to the scale of this thing?

[488] Just for people who are not familiar, how big was it?

[489] Any other interesting things you understand about its operation when it was active?

[490] So when we finally got to looking through the books, you know, the numbers came out as about $1.2 billion in sales.

[491] It's kind of hard, with the fluctuating value of Bitcoin at the time, to come up with a real number.

[492] So you kind of pick a daily average, you know, and go across.

[493] Most of the operation was done in Bitcoin.

[494] It's all done in Bitcoin.

[495] You couldn't.

[496] You had escrow accounts on, you know, you came in and you put money in an escrow account and, you know, the transaction wasn't done until the client got the drugs or whatever they had bought.

[497] And then the drug dealers had sent it in.
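The escrow flow described here, where funds sit in a marketplace account until the buyer confirms delivery, can be sketched as a tiny state machine. This is an illustrative model only, not how Silk Road was actually implemented; the class, state names, and methods are invented for the example.

```python
from enum import Enum, auto


class EscrowState(Enum):
    AWAITING_FUNDS = auto()
    FUNDED = auto()
    RELEASED = auto()   # seller paid after buyer confirms delivery
    REFUNDED = auto()   # dispute resolved in the buyer's favor


class Escrow:
    """Toy model of a marketplace escrow account."""

    def __init__(self, amount_btc: float):
        self.amount_btc = amount_btc
        self.state = EscrowState.AWAITING_FUNDS

    def fund(self):
        # Buyer deposits into the escrow account; the sale is not done yet.
        assert self.state is EscrowState.AWAITING_FUNDS
        self.state = EscrowState.FUNDED

    def confirm_delivery(self):
        # Only once the buyer confirms receipt does the seller get paid.
        assert self.state is EscrowState.FUNDED
        self.state = EscrowState.RELEASED

    def refund(self):
        # If the goods never arrive, the held funds go back to the buyer.
        assert self.state is EscrowState.FUNDED
        self.state = EscrowState.REFUNDED
```

The point of the intermediary state is that neither side has to trust the other directly; both only have to trust the market operator holding the funds.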

[498] There was some talk at the time that the cartel was starting to sell on there.

[499] So that started getting a little hairy there at the end.

[500] What was the understanding of the relationship between organized crime, like the cartels, and this kind of more ad hoc, new-age market that is the Silk Road?

[501] I mean, it was all just chatter.

[502] It was just, you know, because like I said, Jared was in the inside.

[503] So we saw some of it from the admin sides.

[504] And Ross had a lot of private conversations with the different people that advised him.

[505] But no one knew each other.

[506] I mean, the only thing they knew was that the admins had to send an ID to Ross, had to send a picture of their driver's license or passport, which I always found very strange, because if you are an admin on a site that sells fake IDs, why would you send your real ID, and then why would the guy running the site, who profits from selling fake IDs, believe that it was real?

[508] But fast forward, tangent, they were all real IDs.

[509] All the IDs that we found on Ross's computer as the admins were the real people's IDs.

[510] What do you make of that?

[511] Just other clumsiness?

[512] Yeah, low-hanging fruit, I guess.

[513] I guess that's what it is.

[515] I mean, even Ross bought fake IDs off the site.

[516] He had federal agents knock on his door.

[517] You know, and then he got a little cocky about it.

[518] The landscape, the dynamics of trust is fascinating here.

[519] So you trust certain IDs. Like, who do you trust in that kind of market?

[520] What was your understanding of the network of trust?

[521] I mean, I don't think anyone trusts anybody, you know?

[522] I mean, I think Ross had his advisors of trust, but outside of that, I mean, he required people to send their IDs.

[523] For their trust. And, you know, people stole from him; there are open cases of that. It's a criminal world; you can't trust anybody. What was his life like, do you think? Lonely? Can't it be entrapping, something like that, where the whole world focuses on that and you can't tell people what you do all day? Could he have walked away? Like, someone else takes over the site, or just shut it down, either one. Just putting yourself in his shoes: the loneliness, the anxiety, just the growing immensity of it.

[524] So walk away with some kind of financial stability.

[525] I couldn't have made it past two days.

[526] I don't like loneliness.

[527] I mean, if my wife's away, I'd probably call her 10, 12 times a day.

[528] We just talk about things.

[529] You know, I just, you know, something crossed my mind.

[530] I want to talk about it.

[531] And I'm sure she.

[532] And you'd like to talk to her, like, honestly, about everything.

[533] So if you were running Silk Road, you would...

[534] You wouldn't be able to, like...

[535] Hopefully, I'd have a little protection.

[536] I'd only mention it to her when we were in bed, to have that marital connection.

[537] But who knows?

[538] I mean, she's going to question why the Ferrari is outside and things like that.

[539] Yeah.

[540] Well, I'm sure you can come up with something.

[541] Why didn't he walk away?

[542] It's another question of why don't criminals walk away in these situations?

[543] Well, I mean, I don't know.

[544] Every criminal mind is different, and some do.

[545] I mean, AVunit walked away.

[546] I mean, not to go back to that son of a bitch.

[547] There's a theme to this.

[548] But, you know, Ross started counting his dollars.

[549] I mean, he really kept track of how much money he was making.

[550] And it started, you know, growing exponentially.

[551] I mean, if he would have stayed at it, he would have probably been one of the richest people in the world.

[552] And do you think he liked the actual money or the fact of the number growing?

[553] I mean, have you ever held a Bitcoin?

[554] Yeah.

[555] Oh, you have?

[556] Well, he never did.

[557] What do you mean?

[558] He can't hold it.

[559] It's not real.

[560] It's not like I can give you a briefcase of Bitcoin, you know, or something like that.

[561] He liked the idea of it growing. He liked the idea. I mean, I think it started off as sharing this idea, but then he really did turn to, like, I am the captain of this ship and that's what goes. And he was making a lot of money. And again, my interactions with Ross were maybe five or six hours over a two-day period. I knew DPR because I read his words and all that.

[562] I didn't really know Ross.

[563] There was a journal found on his computer, and so it sort of kind of gave me a little insight.

[564] So I don't like to do a playbook for criminals, but I'll tell you right now, don't write things down.

[565] There was a big fad about people, like, remember kids going around shooting people with paintballs and filming it?

[566] I don't know why you would do that.

[567] Why would you videotape yourself committing crime and then publish it?

[568] Like, if there's one thing I've taught my children, don't record yourself doing bad things.

[569] It never goes well.

[570] So, to give advice on the other end: logs are very useful from the defense perspective, for being able to figure out what the attacks were all about. Logs are the only reason I found Hector Monsegur. I mean, the one time his VPN dropped was during the Fox hack, and he says it wasn't even hacking; he was just sent a link and he clicked on it. And in 10 million lines of logs, there was one IP address that stuck out. This is fascinating.
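One IP address sticking out of 10 million log lines is a frequency-analysis problem: in a log dominated by VPN and proxy addresses that appear thousands of times, a one-off residential IP is the anomaly. A minimal, hypothetical sketch of surfacing rare source IPs (the regex, function name, and threshold are assumptions for illustration, not the actual forensic tooling):

```python
import re
from collections import Counter

# Matches dotted-quad IPv4 addresses anywhere in a log line.
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")


def rare_ips(log_lines, threshold=2):
    """Return source IPs that appear fewer than `threshold` times.

    In a log where almost every hit comes through the same VPN exits,
    a rarely-seen address is exactly the kind of line that sticks out.
    """
    counts = Counter()
    for line in log_lines:
        match = IP_RE.search(line)
        if match:
            counts[match.group()] += 1
    return [ip for ip, n in counts.items() if n < threshold]
```

Against real data you would also filter out known infrastructure (CDNs, crawlers) before calling a leftover address interesting.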

[571] We'll explore several angles of that.

[572] So what was the process of bringing down Ross and the Silk Road?

[573] All right, so that's a long story.

[574] You want the whole thing, and you want to break it up.

[575] Let's start at the beginning.

[576] Once we had the information of the chat logs and all that from the server, we found...

[577] What's the chat log?

[578] So the Dot Onion was running the website, the Silk Road, was running.

[579] on a server in Iceland.

[580] How did you figure that out?

[581] That's one of the claims, that the NSA...

[582] Yeah, that's the one that we said that, yeah, I wouldn't tell you if it was.

[583] It's on the internet.

[584] I mean, the internet has their conspiracy theories and all that, so.

[585] But you figure out, that's the part of the thing you do.

[586] It's puzzle pieces and you have to put them together and look for different pieces of information and figure out, okay, so you figure out the server is in Iceland.

[587] We get a copy of it, and so we start getting clues off of that.

[588] Was it a physical copy of the server?

[589] Yeah, you fly over there.

[590] So you go, if you've been to Iceland, if you've never been, you should definitely go to Iceland.

[591] Is it beautiful?

[592] I love it.

[593] I love it.

[594] It was what, so I'll tell you this.

[595] So, sorry, tangents.

[596] I love this.

[597] Yeah.

[598] So I went to Iceland for the anonymous case.

[599] Then I went to Iceland for the Silk Road case.

[600] And I was like, oh, shit, all cyber crime goes through Iceland.

[601] It was just my sort of thing.

[602] And I was over there for like the third time, and I said, if I ever can bring my family here.

[603] Like, so there's a place called Thingvellir, and I'm sure I'm fucking up the name, the Icelanders are pissed right now.

[604] But it's where the North American continental plate and the European continental plate are pulling apart, and it's being filled in with volcanic material in the middle.

[605] And it's so cool.

[606] Like, I was like, one day I'll be able to afford to bring my family here.

[607] And once I left.

[608] Just like the humbling and the beauty of nature.

[609] Just everything, man. It was a different world.

[610] It was, it was insane how great Iceland is.

[611] And so we went back, and we rented a van, and we took friends, and we drove around the entire country.

[612] Absolutely, like, a beautiful place.

[613] Like, Reykjavik's nice, but get out of Reykjavik as quick as you can and see the countryside.

[614] How is this place even real?

[615] Well, it's so new.

[616] I mean, that's, so, you know, our rivers have been going through here for millions of years and flattened everything out and all that.

[617] These are new, this is new land being carved by these rivers.

[618] You can walk behind a waterfall in one place.

[619] It's the most beautiful place I've ever been.

[620] You understand why this is a place where a lot of hacking is being done?

[621] Because the energy is free and it's cool.

[622] So you have a lot of servers going on there.

[623] Server farms; the energy comes up out of the ground, geothermal.

[624] And then it keeps all the servers nice and cool.

[625] So why not keep your computers there at a cheap rate?

[626] I'll definitely visit for several reasons, including to talk to AV unit.

[627] Yeah, he'll watch you there.

[628] Well, the servers are there, but they don't probably live there.

[629] I mean, that's interesting.

[630] I mean, the Pacific time zone, PST... there's so many fascinating things to explore here.

[631] But so you got - Sorry, to add to that.

[632] I mean, the European Internet cable goes through there, so, you know, across to Greenland and down through Canada and all that.

[633] So they have backbone access with cheap energy and free cold weather, you know.

[634] And beautiful.

[635] Oh, and beautiful, yes.

[636] So chat logs on that server, what was in the chat logs?

[637] Everything.

[638] He kept them all.

[639] That's another issue.

[640] If you're running a criminal enterprise, please don't keep it all.

[641] Again, I'm not making a guidebook of how to commit the perfect crime.

[642] But, you know, every chat he ever had, and everyone's chat, it was like going into Facebook of criminal activity.

[643] Yeah, I'm just looking at texts with Elon Musk being part of the conversations.

[644] I don't know if you're familiar, but they've been made public for the court case he's going through with Twitter.

[645] I don't know where it is.

[646] But it made me realize, oh, okay. Generally, that's my philosophy on life: anything I text or email or say, publicly or privately, I should be proud of.

[647] So I try to kind of do that, because, you say don't keep chat logs, but it's very difficult to erase chat logs from this world.

[649] I guess if you're a criminal, that should be, like, you have to be exceptionally competent at that kind of thing.

[650] To erase your footprints is very, very difficult.

[651] Can't make one mistake.

[652] All it takes is one mistake of keeping it.

[653] But yeah, I mean, whatever you put in a chat log or in an email has to hold up, and you have to stand behind it publicly when it comes out. Even if it comes out 10 years from now, you have to stand behind it.

[654] I mean, we're seeing that now in today's society.

[655] Yeah, but that's a responsibility you have to take really, really seriously.

[656] If I was a parent and advising teens, like, you kind of have to teach them that.

[657] I know there's a sense, like, no, we'll become more accustomed to that kind of thing, but in reality, no, I think in the future will still be held responsible for the weird shit we do.

[658] Yeah, a friend of mine, his daughter got kicked out of college because of something she posted in high school.

[659] And the shittiest thing for him, but great for my kids, great lesson.

[660] Look over there and you don't want that to happen to you.

[661] Yeah.

[662] Okay.

[663] So in the chat logs there was useful information, like breadcrumbs of information that you could then pull at?

[664] Yeah, great evidence and stuff.

[665] You know, I mean, obviously evidence too.

[666] Yeah, a lot of evidence.

[667] Here's a sale of this much heroin because, you know, Ross ended up getting charged with czar status on certain things.

[668] And that's, it's a certain weight in each type of drug and that you had like, I think it's four or five employees of your empire, and that you made more than $10 million.

[669] And so it's, you know, it's just like what the narco-traffickers get charged with, or, you know, anybody out of Colombia, you know, and so.

[670] And that was primarily what he was charged with when he was arrested, is the drugs.

[671] Yeah, and he got charged with some of the hacking tools, too.

[672] Okay.

[673] Because he's in prison, what, for two life sentences plus 40 years.

[674] And no possibility of parole?

[675] In the federal system, there's no possibility of parole when you have life. The only way you get out is if the president pardons you. There's always a chance. There is. I think it was close. I heard rumors it was close. Well, right, so it depends. It's fascinating, but given the political, the ideological ideas that he represented and espoused, it's not out of the realm of possibility. Yeah, I mean, I've been asked before, you know, who gets out of prison first, or does Snowden come back into America? And I don't know. I have no idea. He just became a Russian citizen. I saw that, and I've heard a lot of weird theories about that one. Well, actually, on another tangent, let me ask you: do you think Snowden is a good or a bad person? A bad person. Can you make the case that he's a bad person? There's ways of being a whistleblower, and there's rules set up on how to do that. He didn't follow those rules.

[676] I mean, they, you know, I'm red, white, and blue, so I'm pretty, you know.

[677] So you think his actions were anti -American?

[678] I think the results of his actions were anti -American.

[679] I don't know if his actions were anti -American.

[680] Do you think he could have anticipated the negative consequences of his action?

[681] Should we judge him by the consequences or the ideals of the intent of his actions?

[682] I think we all get to judge him by our own best beliefs, but I believe what he did was wrong.

[683] Can you steelman the case that he is actually a good person, and good for this country, for the United States of America, as a flag bearer for the whistleblowers, the check on the power of government?

[684] Yeah, I mean, I'm not a big-government type guy, you know, so, you know, even that sounds weird coming from a government guy for so many years.

[685] But there's rules in place for a reason.

[686] I mean, he put, you know, some of our best capabilities.

[687] He made them publicly available.

[688] They really kind of set us back in, and this isn't my world at all, but the offensive side of cybersecurity.

[689] Right.

[690] So he revealed stuff that he didn't need to reveal in order to make the point.

[691] Correct.

[692] So if you can imagine a world where he leaked stuff that revealed the mass surveillance efforts and did not reveal other stuff.

[694] Yeah.

[695] It's the mass surveillance.

[696] I mean, that's the thing that, of course, there's in the interpretation of that, there's fearmongering, but at the core, that was a real shock to people that it's possible for government to collect data at scale.

[697] It's surprising to me that people are that shocked by it.

[698] Well, there's conspiracies, and then there's, like, actual evidence that that is happening.

[699] I mean, it's a real, there's a lot of reality that people ignore.

[700] But when it hits you in the face, you realize, holy shit, we're living in a new world.

[701] This is, this is the new reality, and we have to deal with that reality.

[702] Just like you work in cybersecurity, I think it really hasn't hit most people.

[703] How fucked we all are in terms of cybersecurity.

[704] Okay, let me rephrase that.

[705] How many dangers there are in a digital world.

[706] How much under attack we all are, and how much more intense the attacks are getting, and how difficult the defense is, and how important it is, and how much we should value it, and all the different things we should do at the small and large scale to defend.

[707] Like most people really haven't woken up, they think about privacy from tech companies.

[708] They don't think about attacks, cyber attacks.

[709] People don't think they're a target and that message definitely has to get out there.

[710] I mean, you know, if you have a voice, you're a target.

[711] Because of the place you work, you might be a target, you know. So your husband might work at some place, and because now people are working from home, they're going to target you to get access to his network, in order to get in. Well, in that same way, the idea that the U.S. government, or any government, could be doing mass surveillance on its citizens is one that was a wake-up call, because you could imagine the ways in which you could abuse the power of that to control a citizenry for political reasons and purposes.

[712] Absolutely.

[713] You know, you could abuse it.

[714] I think during part of the Snowden leak we saw that two NSA guys were monitoring, like, their girlfriends.

[715] And there's rules in place for that.

[716] Those people should be punished for abusing that.

[717] But how else are we going to hear about, you know, terrorists that are in the country talking about birthday cakes?

[718] And, you know, that was a case where that was the trip word, that, you know, we're going to go bomb New York City's subway. Yeah, it's complicated, but it just feels like there should be some balance of transparency, there should be a check on that power. Because, you know, in the name of the war on terror you can sort of sacrifice, if there is a trade-off between security and freedom, but it just feels like there's a giant slippery slope on the sacrificing of freedom in the name of security. I hear you. And, you know, we live in a world where, well, I live in a world where I had to tell you exactly when I arrested someone.

[719] I had to write a 50-page document of how I arrested you and all the probable cause I had against you and all that.

[720] Well, you know, bad guys are reading that.

[721] They're reading how I caught you and they're changing their way they're doing things.

[722] They're changing their MO.

[723] You know, they're doing it to be more secure.

[724] If, you know, we tell people how we're monitoring, you know, what we're surveilling, we're going to lose that.

[725] I mean, the terrorists are just going to go a different way.

[726] And I'm not trying to, again, I'm not big government.

[727] I'm not trying to say that, you know, it's cool that we're monitoring, that the U.S. government's monitoring everything, you know, big tech's monitoring everything.

[728] They're just monetizing it versus possibly using it against you.

[729] But there is a balance.

[730] And those 50 pages, they have a lot of value.

[731] They make your job harder, but they prevent you from abusing the power of the job.

[732] Yeah.

[733] There's a balance.

[734] Yeah.

[735] That's a tricky balance.

[736] So the chat logs in Iceland gave you evidence of the heroin and all the large-scale, czar-level drug trading. What else did it give you in terms of how to catch him? It gave the server structure. So the onion name was actually running on a server in France, which only communicated through a back channel, a VPN, to connect to the Iceland server.

[737] There was a Bitcoin, like kind of vault server that was also in Iceland.

[738] And I think that was so that the admins couldn't get into the bitcoins, the other admins that were hired to work on the site.

[739] So you could get into the site, but you couldn't touch the money.

[740] Only Ross had access to that.

[741] And then, you know, another big mistake on Ross's part is he had the backups for everything at a data center in Philadelphia.

[742] Don't put your infrastructure in the United States.

[743] I mean, again, let's not make a playbook, but, you know.

[744] I think these are low-hanging fruit that people of competence would know already.

[745] I agree.

[746] But it's interesting that he wasn't competent enough to make, so he was incompetent in certain ways.

[747] Yeah, I don't think he was a mastermind of setting up an infrastructure that would protect his online business because, you know, keeping chat logs, keeping a diary, putting infrastructure where it shouldn't be.

[748] Bad decisions.

[749] How did you figure out that he's in San Francisco?

[750] So we had that part with Jared that he was on the West Coast.

[751] And then...

[752] Who again is Jared?

[753] Jared Der-Yeghiayan was a partner in...

[754] He was a DHS agent.

[755] Worked for HSI Homeland Security Investigations in Chicago.

[756] He started his Silk Road investigation because he was working at O'Hare and a weird package came in, and, come to find out, he traced it back to Silk Road.

[757] So he started working on a Silk Road investigation long before I started my case, and he made his way up, undercover, all the way to being an admin on Silk Road.

[758] So he was talking to Ross on a Jabber server, a private Jabber server, private chat communication server, and we noticed that Ross's time zone on that Jabber server was set to the West Coast.

[759] So we had Pacific time on there.

[760] So we had a region.

[761] One twenty-fourth of the world was covered, of where we thought he might be.

[762] And from there, how do you get to San Francisco?

[763] There was another guy, an IRS agent that was part of the team.

[764] And he used a powerful tool to find his clue.

[765] He used the world of Google.

[766] He simply just went back and Googled around for Silk Road at the time it was coming up, and found some posts on, like, some help forums, that this guy was starting an onion website and wanted some cryptocurrency help.

[768] And if you could help him, please reach out to rossulbricht@gmail.com. In my world...

[769] That's a clue.

[770] Okay, so that's as simple as that.

[771] Yeah, and the name he used on that post was Frosty.

[772] Yeah, so you have to connect Frosty and other uses of Frosty, and here's a Gmail, and the Gmail has the name.

[773] The Gmail account posted, I need help, under the name Frosty on this forum.

[774] So what's the connection of Frosty elsewhere?

[775] The person logging into the Philadelphia backup server, the name of the computer was Frosty.

[776] Another clue in my world.

[777] And that's it.

[778] The name is there.

[779] The connection to the Philadelphia server and then to Iceland is there.

[780] And so the rest is small details in terms of, or is there interesting details?

[781] No, I mean, there's some electronic surveillance that finds Ross Ulbricht living in a house, and is there, you know, a computer at his house attaching to, you know, does it have Tor traffic at the same time that DPR's on?

[782] Another big clue.

[783] Matching up time frames.
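The "matching up time frames" clue can be illustrated as simple interval-overlap counting: how often does Tor activity observed at the house coincide with sessions where DPR is logged in? A minimal sketch, with entirely made-up intervals (this is an illustration of the idea, not the actual surveillance tooling):

```python
# Half-open intervals (start, end), in minutes since some arbitrary epoch.
def overlaps(a: tuple[int, int], b: tuple[int, int]) -> bool:
    return a[0] < b[1] and b[0] < a[1]

def overlap_rate(house_tor, dpr_sessions):
    # Fraction of house Tor-activity intervals that coincide with
    # at least one DPR login session.
    hits = sum(any(overlaps(h, d) for d in dpr_sessions) for h in house_tor)
    return hits / len(house_tor)

house_tor = [(0, 60), (200, 260), (500, 560)]      # hypothetical observations
dpr_sessions = [(10, 90), (210, 250), (900, 960)]  # hypothetical observations

print(overlap_rate(house_tor, dpr_sessions))  # 2 of 3 intervals overlap
```

A high overlap rate is circumstantial on its own, which is why it is described here as one clue among several rather than proof.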

[784] Again, just putting your email out there, putting your name out there like that, at the scale of that market...

[785] It just makes me wonder how many criminals are out there that are not making these low-hanging fruit mistakes and are still successfully operating.

[786] To me, it seems like you could be a criminal.

[787] It's much easier to be a criminal on the internet.

[788] What else to you is interesting to understand about that case of Ross and Silk Road and just the history of it, from your own relationship with it, from a cybersecurity perspective, from an ethical perspective, all that kind of stuff?

[789] Like, when you look back, what's interesting to you about that case?

[790] I think my views on the case have changed over time.

[791] I mean, it was my job back then.

[792] So I just looked at it as of, you know, I'm going after this.

[793] I sort of made a name for myself in the Bureau for the anonymous case.

[794] And then this one was just, I mean, this was a bigger deal.

[795] I mean, they flew me down to D.C. to meet with the director about this case.

[796] The president of the United States was going to announce this

[797] case, the arrest.

[798] Unfortunately, the government shut down two days before.

[799] So it was just us.

[800] And that's really the only reason I had any publicity out of it is because the government shut down.

[801] And the only thing that went public was that affidavit with my signature at the end.

[802] Otherwise, it would have just been the attorney general and the president announcing the arrest of this big thing.

[803] And you wouldn't have seen me. Did you understand that this was a big case?

[804] Yeah, I knew it at the moment.

[805] Yeah, I knew it at the time.

[806] Was it because of the scale of it or what it stood for?

[807] I just knew that the public was going to react in a big way.

[808] Like the media was not.

[809] Did I think that it was going to be on the front page of every newspaper and the day after the arrest?

[810] No. But I could sense it.

[811] Like I went like three or four days without sleep.

[812] When I was out in San Francisco to arrest Ross, I had sent three guys to Iceland, so it was a three-prong approach for the takedown.

[813] It was get Ross, get the bitcoins, and seize the site.

[814] Like, we didn't want someone else taking control of the site.

[815] And we wanted that big splash of that banner.

[816] Look, the government found this site.

[817] Like, you might not want to think about doing this again.

[818] And you were able to pull off all three?

[819] Maybe that's my superpower.

[820] I'm really good about putting smarter people than I am together and on the right things.

[821] It's the only way to do it.

[822] In the business I formed, that's what I did.

[823] I hired only smarter people than me. And, you know, I'm not that smart, but, you know, smart enough to know who the smart people are.

[824] The team was able to do all three.

[825] Yeah, we were able to get all three done.

[826] Yeah, and one of the guys, the main guy I sent to Iceland, man, he was so smart.

[827] Like, I sent another guy from the FBI to France to get that part, and he couldn't do it.

[828] So the guy in Iceland did it from Iceland.

[829] They had to pull some stuff out of memory on a computer.

[830] You know, it's live process stuff.

[831] I'm sure you've done that before, or I'm sure you have.

[832] Look what you're doing.

[833] This is like a multi-layer interrogation going on.

[834] Was there a concern that somebody else would step in and control the site?

[835] Absolutely.

[836] We didn't have insight on who exactly had control.

[837] So it turns out that Ross had, like, dictatorial control.

[838] So it wasn't easy to delegate to somebody else.

[839] He hadn't.

[840] I think he had some sort of ideas.

[841] I mean, his diary talked about walking away and giving it to somebody else, but he didn't, he couldn't give up that control to anybody, apparently.

[842] Which makes you think that power corrupts and his ideals were not as strong as he espoused, because if it was about the freedom of being able to buy drugs if you want to, then he surely should have found ways to delegate that power.

[843] Well, he changed over time, you could see it in his writings, that he changed.

[844] Like, so people argue back and forth that there was never murders on Silk Road.

[845] When we were doing the investigation, to us, there were six murders.

[846] So the way we saw him at the time was, Ross ordered people to be murdered. You know, some people stole from him and all that. It was sort of an evolution from, oh man, I can't deal with this, I can't do it, it's too much, to the last one, where the guy said, well, he's got three roommates, and it's like, oh, we'll kill them too. Was that ever proven in court? No, the murders never went forward because there were some problems in that case.

[847] So there was a separate case in Baltimore that they had been working on for a lot longer.

[848] And so, you know, during the investigation, that caused a bunch of problems, because now we have multiple federal agencies' cases against the same thing.

[849] How do you decide not to push forward the murder investigations?

[850] So there was a deconfliction meeting that happened in D.C. I didn't happen to go to that meeting, but Jared went. This was before I ever knew Jared.

[851] And we have, like, televisions where we can just sit in a room and sit in on the meeting, but it's all, you know, secured network and all that, so we can talk openly about secure things. And we sat in on the meeting, and people just kept saying the term "sweat equity."

[852] I've got sweat equity, meaning that they had worked on the case for so long that they deserved to take them down. And by this time, you know, no one knew about us, but we told them at the meeting that, well, we had found the server, and we have a copy of it, and we have the infrastructure.

[853] And these guys had just had communications, undercovers.

[854] They didn't really know what was going on.

[855] And this wasn't my first deconfliction meeting.

[856] We had a huge deconfliction meeting during the anonymous case.

[857] What's the deconfliction meeting?

[858] Agents within your agency or other federal agencies have an open investigation such that, if you exposed your case or took down your case, it would hurt their case, or the other way around.

[859] So you kind of have a, it's like the rival gangs meeting at the table in a smoke-filled room.

[860] Less bullets at the end, but yes.

[861] Oh, boy, with the sweat equity.

[862] Yeah.

[863] I mean, there's careers at stake, right?

[864] Yeah.

[865] You hate that idea.

[866] Yeah.

[867] I mean, why would you, why is that at stake?

[868] Just because you've worked on it longer than I have, that means you did better?

[869] Yeah.

[870] That's insane to me. That's rewarding bad behavior.

[871] And so part of the sweat equity discussion was about murder. And this was, here's a chance to actually bust them, given the data from Iceland and all that kind of stuff. So why? Well, they wanted us just to turn the data over to them. To them? Yeah, thanks, thanks for getting us this far, here it is. I mean, it came to the point where they sent us, they had a picture of who they thought Ross was, and it was an internet meme. It really was a meme, it was a photo that we could look up. It was insane. All right.

[872] So there's different degrees of competence all across the world between different people.

[873] Yes.

[874] Okay.

[875] Does part of you have regret, because you pushed forward on the heroin and the drug trade?

[876] We never got to the murder discussion.

[877] I mean, the only regretting is that the internet doesn't seem to understand.

[878] They just kind of blow that part off, that he literally paid people to have people murdered.

[879] It didn't result in a murder.

[880] And I thank God, it never resulted

[881] in a murder.

[882] But that's where his mind was.

[883] His mind, and what he wrote in his diary, was that, I had people killed, and here's the money.

[884] He paid it.

[885] He paid a large amount of bitcoins for that murder.

[886] So he didn't just even think about it.

[887] He actually took action, but the murders never happened.

[888] He took action by paying the money.

[889] Correct.

[890] And the people came back with results.

[891] He thought they were murdered.

[892] That said, can you understand the steel man on the case for the drug trade on Silk Road?

[893] Like, can you make the case that it's a net positive for society?

[894] So there was a time period of when we found out the infrastructure and when we built the case against Ross.

[895] I don't remember.

[896] Six weeks, a month, two months, I don't know, somewhere in there.

[897] But then at Ross's sentencing, there was a father that stood up and talked about his son dying.

[898] And I went back and kind of did the math.

[899] And it was between those time periods of when we knew we could shut it down.

[900] We could have pulled the plug on the server and gone

[901] on.

[902] And when Ross was arrested, his son died from buying drugs on Silk Road.

[903] And I still think about that father a lot.

[904] But if we look at scale at the war on drugs, even outside of Silk Road, do you think the war on drugs by the United States has alleviated more suffering or caused more suffering in the world?

[905] That might be above my pay scale.

[906] I mean, I understand the other side of the argument.

[907] I mean, people said that I don't have to go down to the corner to buy drugs.

[908] I'm not going to get shot on the corner buying drugs or something.

[909] I can just have them sent to my house.

[910] People are going to do drugs anyways.

[911] I understand that argument.

[912] From my personal standpoint, if I made it more difficult for my children to get drugs, then I'm satisfied.

[913] So your personal philosophy is that if we legalize all drugs, including heroin and cocaine, that that would not make for a better world.

[914] I don't.

[915] No, personally, I don't believe legalizing all drugs would make for a better world.

[916] Can you imagine that it would?

[917] Do you understand that argument?

[918] Sure.

[919] I mean, as I've gotten older, I like to see both sides of an argument.

[920] And when I can't see the other side, that's when I really like to dive into it.

[921] And I can see the other side.

[922] I can see why people would say that.

[923] But I don't want to raise my children in a world where drugs are just free for use.

[924] Well, and then the other side of it is, with Silk Road, did, you know, taking down Silk Road, did that increase or decrease the number of drug trading criminals in the world?

[925] It's unclear.

[926] Online, I think it increased.

[927] I think, you know, that's one of the things I think about a lot with Silk Road was that no one really knew.

[928] I mean, there was, you know, thousands of users.

[929] But then after that, it was on the front page of the paper, and there were millions of people who knew about Tor and onion sites.

[930] It was an advertisement.

[931] I would have thought, I thought crypto was going to crash right after that.

[932] Like, I don't know, like, what people now see that bad people are doing bad things with crypto?

[933] That'll crash.

[934] Well, I'm obviously wrong on that one.

[935] And I thought, you know, Ross was sentenced to two life sentences plus 40 years.

[936] No one's going to start up.

[937] These dark markets exploded after that.

[938] You know, some of them started as, you know, opportunistic.

[939] I'm going to, you know, take those escrow accounts and I'm going to steal all the money that came in, you know, they were there for that.

[940] But, you know, but there were a lot of dark markets that popped up after that.

[941] Now we put the playbook out there.

[942] Yeah.

[943] Yeah.

[944] But also there's a case for, did you ever think about not taking it down? If you had not taken down Silk Road, you could use it, because it's a market.

[945] It itself is not necessarily the primary criminal organization.

[946] It's a market for criminals.

[947] So it could be used.

[948] to track down criminals in the physical world.

[949] So if you don't take it down, given, you know, how centralized it was, it could be used as a place to find criminals, right?

[950] So the dealers, the drug dealers?

[951] Yeah.

[952] So if you have the cartels start getting involved, you go after the dealers.

[953] It would have been very difficult.

[954] Because of Tor and all that.

[955] Because all the protections, the anonymity.

[956] Decloaking all that would have been drastically more difficult. And a lot of people in upper management of the FBI didn't have the appetite for running something like that; that would have been the FBI running a drug market. How many kids, how many fathers would have to come in and say, my kid bought drugs while the FBI was running a site, a drug site, and my kid died? So I didn't know anybody in FBI management who would have the appetite to let us run what was happening on Silk Road. You know, because remember, at that time we still believed six people were dead.

[957] We're still investigating, where are all these bodies?

[958] You know, that's pretty much why we took down Ross when we did.

[959] I mean, we had to jump on it fast.

[960] What else can you say about this complicated world that has grown of the dark web?

[961] I don't understand it.

[962] It would have been something for me. I thought it was going to collapse.

[963] But, I mean, it's just gotten bigger in what's going out there.

[964] Now, I'm really surprised that it hasn't grown into other networks,

[965] or people haven't developed other networks, but...

[966] You mean, like, instead of Tor?

[967] Yeah, Tor's still the main one out there.

[968] I mean, there's a few others, and I'm not going to put an advertisement out for them.

[969] But, you know, I thought that market would have grown.

[970] Yeah, my sense was when I interacted with Tor, it was that there's huge usability issues.

[971] But that's for, like, legal activity.

[972] Yeah.

[973] Because, like, if you care about privacy, it's just not as good of a browser.

[974] Like, to look at stuff.

[975] No, it's way too slow.

[976] It's way too slow.

[977] But I mean, you can't even, like, I know some people would use it to, like, view movies.

[978] Like, Netflix, you can only view certain movies in certain countries.

[979] You can use it for that, but it's too slow even for that.

[980] Were you ever able to hold in your mind the landscape of the dark web?

[981] Like, what's going on out there?

[982] To me, as a human being, it's just difficult to understand the digital world.

[983] Like, these anonymous usernames, like doing anonymous

[984] activity, it's just, it's hard to, what am I trying to say?

[985] It's hard to visualize it in the way I can visualize, like I've been reading a lot about Hitler.

[986] I can visualize meetings between people, military strategy, deciding on certain evil atrocities, all that kind of stuff.

[987] I can visualize the people, there's agreements, handshakes, stuff signed, groups built. But in the digital space, with bots, with anonymity, any one human can be multiple people. It's just, yeah. It's all lies. It's all lies. Yeah, it feels like I can't trust anything. No, you can't, you honestly can't. And, like, you can talk to two different people and it's the same person. There's so many different, you know, Hector had so many different identities online, you know, the lies to each other. I mean, he lied to people inside his group, just to use another name to spy on them, make sure they weren't talking shit behind his back or weren't doing anything.

[988] It's all lies and people that can keep all those lies straight.

[989] It's unbelievable to me. Ross Ulbricht represents the very early days of that.

[990] That's why the competence wasn't there.

[991] Just imagine how good the people are now, the kids that grow up.

[992] Oh, they've learned from his mistakes.

[993] Just the extreme competence.

[994] You just see how good people are at video games, like the level of play in terms of video games.

[995] Like, I used to think I sucked.

[996] And now I'm not even, like, I'm not even in the, like, consideration of calling myself shitty at video games.

[997] I'm not even, I'm, like, non-existent.

[998] I'm like the mold.

[999] Yeah, I stopped playing. It's so embarrassing.

[1000] It's like wrestling with your kid and he finally beats you.

[1001] He's like, well, fuck that.

[1002] I'm not wrestling my kid ever again.

[1003] And in some sense, hacking at its best and its worst is a kind of game, and you can get exceptionally good at that kind of game.

[1004] And you get the accolades of it.

[1005] I mean, there's, you know, there's power that comes along.

[1006] If you have success, look at the kid that was hacking into Uber and Rockstar Games.

[1007] He put it out there that he was doing it.

[1008] I mean, he used the name, whatever hacked into Uber was his screen name.

[1009] He was very proud of it.

[1010] I mean, one, building evidence against himself.

[1011] But, you know, he wanted that slap on the back.

[1012] Like, look at what a great hacker you are.

[1013] Yeah.

[1014] What do you think is in the mind of that guy?

[1015] What do you think is in the mind of Ross?

[1016] Do you think they see themselves as good people?

[1017] Do you think they acknowledge the bad they're doing onto the world?

[1018] So that Uber Hacker, I think that's just youth, not realizing what consequences are, I mean, based on his actions.

[1019] Ross was a little bit older.

[1020] I think Ross truly is a libertarian.

[1021] He truly had his beliefs that he could provide the gateway for other people to live that libertarian lifestyle and put in their body what they want.

[1022] I don't think that was a front or a lie.

[1023] What's the difference between DPR and Ross?

[1024] You said, like, I never met Ross, I've only had those two days' worth of interaction.

[1025] Yeah.

[1026] It's just interesting, given how long you've chased him, and then having met him, what was the difference to you as a human being?

[1027] He was a human being.

[1028] He was, he was, you know, he was an actual person.

[1029] He was nervous when we arrested him.

[1030] So one of the things that I learned through my law enforcement career is, if I'm going to be the case agent, the one in charge of, you know, dealing with this person, I'm not putting handcuffs on him.

[1031] Somebody else is going to do that.

[1032] Like, I'm going to be there to help him, you know, I'm your conduit to help.

[1033] And so, you know, right after someone's arrested, you obviously have patted him down for weapons, to make sure, for everybody's safety.

[1034] But then I just put my hand on their chest.

[1035] Just feel their heart, feel their breathing.

[1036] You're going to, I'm sure it's the scariest day.

[1037] But then to have that human contact kind of settles people down.

[1038] And you kind of let's start thinking about this.

[1039] I'm going to tell you, you know, I'm going to be open and honest with you.

[1040] You know, there's a lot of cops out there and federal agents, cops that just go to the hard-ass tactic.

[1041] You don't get very far with that.

[1042] You don't get very far being a mean asshole to somebody, you know, be compassionate, be human.

[1043] And it's going to go a lot further.

[1044] So given everything he's done, you were still able to have compassion for him.

[1045] Yeah.

[1046] We took him to the jail, and we, so it was after hours, so he didn't get to see a judge that day.

[1047] So we stuck him in the San Francisco jail.

[1048] I hadn't slept for about four days because I was dealing with people in Iceland, bosses in D.C., bosses in New York.

[1049] And I was in San Francisco.

[1050] So time frame, like the Iceland people were calling me when I was supposed to be sleeping.

[1051] It was insane.

[1052] But I still went out that night.

[1053] While Ross sat in jail, I bought him breakfast.

[1054] I said, what do you want for breakfast?

[1055] I'll have a nice breakfast for you because we picked him up in the morning and took him over to the FBI to do the FBI booking, the fingerprints and all that.

[1056] And I got him breakfast.

[1057] I mean, and you don't get paid back for that sort of thing.

[1058] I'm not looking, but out of my own...

[1059] Did he make special requests for breakfast?

[1060] Yeah, he asked for certain things.

[1061] Can you mention?

[1062] Is that top secret FBI?

[1063] Not top secret.

[1064] I think he wanted some granola bars.

[1065] And, you know...

[1066] Yeah.

[1067] But, I mean, he already had lawyered up.

[1068] So we, you know, which is his right, he can do that.

[1069] So I knew we weren't going to work together, you know, like I did with Hector.

[1070] But, I mean, this is the guy's last day.

[1071] Most of the conversations have to be them with lawyers.

[1072] From that point on, I can't question him when he asked for a lawyer or if I did, it couldn't be used against him.

[1073] So we just had a conversation where I talked to him.

[1074] You know, he could, you know, could say things to me, but then I would remind him that he asked for a lawyer and he'd have to waive that and all that.

[1075] But we didn't talk about his case so much.

[1076] We just talked about, like, human beings.

[1077] Did he, with his eyes,

[1078] with his words, reveal any kind of regret or did you see a human being changing, understanding something about themselves in the process of being caught?

[1079] No, I don't think that.

[1080] I mean, he did offer me $20 million to let him go when we were driving to the jail.

[1081] Oh, no. And I asked him what we were going to do with the agent that sat in the front seat.

[1082] The money really broke him, huh?

[1083] I think so.

[1084] I think he kind of got caught up in how much money it was.

[1085] And how, you know, when crypto started, it was pennies.

[1086] And by the time he got arrested, it was 120 bucks.

[1087] And, you know, 177,000 Bitcoins, even today, you know, that's a lot of Bitcoin.
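The back-of-the-envelope arithmetic behind these figures: roughly 177,000 bitcoins at about $120 per coin at the time of the 2013 arrest comes to on the order of $21 million, and the same stash scales linearly with any later price (the later price used below is purely illustrative):

```python
# Figures mentioned in the conversation: ~177,000 BTC seized,
# at roughly $120 per coin around the October 2013 arrest.
btc_seized = 177_000
price_at_arrest = 120                      # approximate USD price, 2013

value_2013 = btc_seized * price_at_arrest
print(f"${value_2013:,}")                  # $21,240,000

# The stash scales linearly with price; e.g. at a hypothetical
# $30,000 per coin the same coins would exceed $5 billion.
value_later = btc_seized * 30_000
print(f"${value_later:,}")                 # $5,310,000,000
```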

[1088] So you really could have been, if you continued to be one of the richest people in the world.

[1089] I possibly could have been if I took that, that $20 million, then.

[1090] I could have been living, we could have this conversation in Venezuela.

[1091] In a castle, in a palace.

[1092] Yeah, until it runs out.

[1093] and then the government storms the castle.

[1094] Yeah.

[1095] Have you talked to Ross since?

[1096] No. No. I'd be open to it.

[1097] I don't think he probably wants to hear from me. And do you know where, in which prison he is?

[1098] I think he's somewhere out in Arizona.

[1099] I know he was in the one next to Supermax for a little while, like the high security one that shares the fence with Supermax, but I don't think he's there anymore.

[1100] I think he's out in Arizona.

[1101] I haven't seen him in a while.

[1102] I wonder if you can do interviews in prison.

[1103] That would be nice. Some people are allowed to. So, I don't, I've not seen an interview with him. I know people have wanted to interview him about books and that sort of thing. Right, because the story really blew up. Did it surprise you how much the story, and many elements of it, blew up? Movies? It did surprise me. Like, my wife's uncle, who, I've been married to my wife for 22 years now, I don't think he knew my name, and he was excited about that. He reached out when Silk Road came out. So he, you know, that was surprising to me.

[1104] Did you think the movie was, on the topic was good?

[1105] I didn't have anything to do with that movie.

[1106] I've watched it once.

[1107] It was kind of cool that Jimmy Simpson, you know, was my name in the movie.

[1108] But outside of that, I thought it sort of missed the mark on some things.

[1109] With Hollywood, I don't think they understand what's interesting about these kinds of stories.

[1110] And there's a lot of things that are interesting and they missed all of them.

[1111] So, for example, I recently talked to John Carmack, who is a world-class developer and so on.

[1112] So Hollywood would think that the interesting thing about John Carmack is some kind of like shitty, like a parody of a hacker or something like that.

[1113] They would show like really crappy, like emulation of some kind of Linux terminal thing.

[1114] The reality is like the technical details for five hours with him, for 10 hours with him, is what people actually want to see.

[1115] Even people that don't program, they want to see a brilliant mind, the details, that they're not, even if they don't understand all the details, they want to have an inkling of the genius there.

[1116] That's just one way, I'm saying, that you want to reveal the genius, the complexity of that world in interesting ways, and to make a Hollywood almost parody caricature of it, it just destroys the spirit of the thing.

[1117] So one, the Operation FBI is fascinating, just tracking down these people, on the cybersecurity front, it's fascinating.

[1118] The other is just how you run Tor, how you run this kind of organization, the trust issues of the different criminal entities involved, the anonymity, the low-hanging fruit, the being shitty at certain parts on the technical front.

[1119] All of those are fascinating things.

[1120] That's what a movie should reveal.

[1121] It should probably be a series, honestly, a Netflix series instead of a movie.

[1122] Yeah, or an FX show or something like that. Kind of gritty, you know.

[1123] Yeah, yeah, gritty.

[1124] Exactly, gritty.

[1125] I mean, shows like Chernobyl from HBO made me realize, okay, you can do a good job of a difficult story and, like, reveal the human side, but also reveal the technical side and have some deep, profound understanding on that case, on the bureaucracy of a Soviet regime.

[1126] In this case, you could reveal the bureaucracy, the chaos of a criminal organization, of a law enforcement organization.

[1127] I mean, there's so much to, like, explore.

[1128] It's fascinating.

[1129] Yeah, I like Chernobyl.

[1130] When I rewatch it, I can't watch episode three, though.

[1131] The animal scene, the episode where they go around shooting all the dogs and all that.

[1132] I got to skip that part.

[1133] You're a big softy, aren't you?

[1134] I really am.

[1135] I'm sure I'll probably cry at some point.

[1136] I love it.

[1137] I love it.

[1138] Listen.

[1139] Don't get me talking about that episode you made about your grandmother.

[1140] Oh, my God.

[1141] That was rough.

[1142] Just to linger on this ethical versus legal question, what do you think about people like Aaron Swartz?

[1143] I don't know if you're familiar with him, but he was somebody who broke the law in the name of an ethical ideal.

[1144] He downloaded and released academic publications that were behind a paywall.

[1145] And he was arrested for that and then committed suicide.

[1146] And a lot of people see him, certainly in the MIT community, but throughout the world as a hero.

[1147] because you look at the way knowledge, scientific knowledge is being put behind paywalls, it does seem somehow unethical.

[1148] And he basically broke the law to do the ethical thing.

[1149] Now you could challenge it, maybe it is unethical, but there's a gray area, and to me at least it is ethical; to me at least he is a hero.

[1150] Because I'm familiar with the paywall created by the institutions that hold these publications: they're adding very little value.

[1151] So it is basically holding hostage the work of millions of brilliant scientists for some kind of, honestly, a crappy capitalist institution.

[1152] Like, they're not actually making that much money.

[1153] It doesn't make any sense to me. To me, it should all be open, public access.

[1154] There's no reason it shouldn't be, for all publications.

[1155] He stood for that ideal and was punished harshly for it.

[1156] That's the other criticism.

[1157] He was punished too harshly.

[1158] And of course, deeply, unfortunately, that also led to his suicide because he was also tormented on many levels.

[1159] I mean, are you familiar with him?

[1160] What do you think about that line between what is legal?

[1161] and what is ethical?

[1162] So it's a tough case.

[1163] I mean, the outcome was tragic, obviously.

[1164] Unfortunately, when you're in law enforcement, you have to, your job is to enforce the laws.

[1165] I mean, it's not, if you're told that you have to do a certain case, you know, and there is a violation of, at the time, you know, 18 U.S.C. 1030, computer hacking.

[1166] You have to press forward with that.

[1167] I mean, you have to charge the, you bring the case to the U.S. Attorney's office, and whether they're going to press charges or not, you know, you can't really pick and choose what you press and don't press forward.

[1168] I never felt that flexibility, at least not in the FBI.

[1169] I mean, maybe when you're a street cop and you pull somebody over, you can let them go with a warning.

[1170] So in the FBI, you're sitting in a room, but you're also, you're also a human being.

[1171] You have compassion.

[1172] You arrested Ross and the hand on the chest.

[1173] I mean, that's a human thing.

[1174] You're right, so there's a, but I can't be the jury for whether it was a good hack or a bad hack. It's all, someone, a victim, has come forward and said, we're the victim of this. And I agree with you, because, again, the basis of the internet was to share academic thought. Yeah, I mean, that's where the internet was born. But it's not up to you. So the role of the FBI is to enforce the law. Correct, and, you know, there's a limited number of tools on our Batman belt that we can use.

[1175] You know, not to get into all the aspects of the Trump case and Mar-a-Lago and the documents there.

[1176] I mean, the FBI has so many tools they can use and a search warrant is the only way they could get in there.

[1177] I mean, that's it.

[1178] It's a, you know, there's no other legal document or legal way to enter and get those documents.

[1179] What do you think about the FBI and Mar-a-Lago, and the FBI taking the documents from Donald Trump?

[1180] You know, it's a tough spot.

[1181] It's a really tough spot.

[1182] The FBI has gotten a lot of black eyes, you know, recently.

[1183] And I don't know if it's the same FBI that I remember when I was there.

[1184] Do you think they deserve it in part?

[1185] Was it done clumsily?

[1186] Their raiding of the former president's residence?

[1187] It's tough.

[1188] You know, because, again, they're only limited to what they're legally allowed to do.

[1189] And a search warrant is the only legal way of doing it.

[1191] I have my personal and political views on certain things, you know, and I think it might, yes, it might be surprising to some where those political points stand. You told me offline that you're a hardcore communist, that was very, very surprising to me. Well, that's only because you tried to bring me into the communist party.

[1192] Exactly.

[1193] I was trying to recruit you.

[1194] I was giving you all kinds of flyers.

[1195] Okay, but you said, like, you know, people in FBI are just following the law, but there's a chain of command and so on.

[1196] What do you think about the conspiracy theory that some small number of people inside the FBI conspired to undermine the presidency of Donald Trump?

[1197] If you would ask me when I was inside and before all this happened, I would say it could never happen.

[1198] I don't believe in conspiracies.

[1199] You know, there's too many people involved for something not to come out with some sort of information.

[1201] But, I mean, from more of the stuff that comes out, it's surprising that, you know, agents are being fired because of certain actions they're taking inside, and being dismissed because of politically motivated actions.

[1202] So do you think it's explicit or just pressure?

[1203] Do you think there could exist just pressure at the higher -ups that has a political leaning and you kind of maybe don't explicitly order any kind of thing, but just kind of pressure people to lean one way or the other and then create a culture that leans one way or the other based on political leanings?

[1204] You would really, really hope not.

[1205] But, I mean, that seems to be the narrative that's being written.

[1206] But when you were operating, you didn't feel that pressure.

[1207] Man, I was at such a low level.

[1208] You know, I had no aspirations of being a boss.

[1209] I wanted to be a case agent my entire life.

[1210] So you love the puzzle of it, the chase.

[1211] I love solving things, yeah.

[1212] To be management and manage people and all that, like, no desire whatsoever.

[1214] What do you think about Mark Zuckerberg on Joe Rogan's podcast saying that the FBI warned Facebook about potential foreign interference?

[1215] And then Facebook inferred from that that they're talking about the Hunter Biden laptop story, and thereby censored it.

[1216] What do you think about that whole story?

[1217] Again, if you asked me when I was in the FBI, from being on the inside, I wouldn't believe these things. But there's a certain narrative being written that is surprising to me, that the FBI is involved in these stories.

[1218] But the interesting thing there is, the FBI is saying that they didn't really make that implication.

[1219] They're saying that there's interference activity happening.

[1220] Just watch out.

[1221] And it's a weird relationship between FBI and Facebook.

[1222] You could see from the best possible interpretation that the FBI just wants Facebook to be aware because it is a powerful platform, a platform for viral spread of misinformation.

[1223] So in the best possible interpretation of it, it makes sense for the FBI to send some information saying, like, we're seeing some shady activity.

[1224] Absolutely.

[1225] But it seems like all of that somehow escalated to a political interpretation.

[1226] I mean, yeah, it sounded like there was a wink, wink with it.

[1227] I don't know if Mark meant for that to be that way.

[1228] You know, like, again, are we being social engineered?

[1229] Or was that a true, you know, expression that Mark had?

[1230] And I wonder if the wink-wink is direct or it's just culture.

[1231] You know, maybe certain people responsible on the Facebook side have a certain political lean.

[1232] And then certain people on the FBI side have a political lean when they're interacting together.

[1233] And it literally has nothing to do with a giant conspiracy theory, but just with a culture that has a particular political lean during a particular time in history.

[1234] And so, like, maybe it could be the Hunter Biden laptop one time, and then it could be whoever, Donald Trump Jr.'s laptop another time.

[1235] It's a tough job.

[1236] I mean, if you're the liaison, if you're the FBI's liaison to Facebook, you know, there's certain people that, I'm sure, were offered a position at some point.

[1237] It seems, you know, there's FBI agents that go. I know a couple that have gone to Facebook.

[1238] This is a really good agent that now leads up their child exploitation stuff. Another squadmate runs their internal investigations. Both great investigators. So, you know, there's good money, especially when you're an FBI agent that's capped out at, you know, a 13-10 or whatever pay scale you're capped out at. It's alluring, to, you know, maybe want to please them. And over time, that corrupts.

[1239] I think there has to be an introspection in tech companies about the culture that they develop, about the political ideology, the bubble.

[1240] It's interesting to see that bubble.

[1241] Like I've asked myself a lot of questions.

[1242] I've interviewed the Pfizer CEO, what seems now a long time ago.

[1243] And I've gotten a lot of comments, the positive comments, but also criticism from that conversation.

[1244] And I did a lot of soul searching about the kind of bubbles we have in this world.

[1245] And it makes me wonder, pharmaceutical companies, they all believe they're doing good.

[1246] And I wonder, because the ideal they have is to create drugs that help people and do so at scale.

[1247] And it's hard to know at which point that can be corrupted, and it's hard to know when it was corrupted, and if it was corrupted, and where, which drugs and which companies and so on.

[1248] And I don't know.

[1249] I don't know. It's that complicated.

[1250] It seems like inside a bubble you can convince yourself that anything is good.

[1251] People inside the Third Reich regime were able to convince themselves, I'm sure many.

[1252] Just Bloodlands.

[1253] There's another book I've been recently reading about it.

[1255] And the ability of humans to convince themselves they're doing good, when they're clearly murdering and torturing people in front of their eyes, is fascinating.

[1256] They're able to convince themselves they're doing good.

[1257] It's crazy.

[1258] Like there's not even an inkling of doubt.

[1259] Yeah, I don't know what to make of that.

[1260] So it has taught me to be a little bit more careful when I enter into different bubbles, to be skeptical about what's taken as an assumption of truth.

[1261] Like you always have to be skeptical about like what's assumed as true.

[1262] Is it possible it's not true?

[1263] You know, if you're talking about America, it's assumed, you know, in certain places, that surveillance is good.

[1264] Well, let's question that assumption.

[1265] Yeah.

[1266] And also, it inspired me to question my own assumptions that I hold as true.

[1267] Constantly.

[1268] Constantly.

[1269] It's tough.

[1270] But you don't grow.

[1271] I mean, do you want to be just static and not grow?

[1272] You have to question yourself on some of these things if you want to grow as a person.

[1273] Yeah, for sure.

[1274] Now, one of the tough things actually of being a public personality when you speak publicly is you get attacked all along the way as you're growing.

[1275] And I'm in part a big softie as well, if I may say.

[1276] And those hurt. It hurts, it hurts, it hurts.

[1277] Do you pay attention to it?

[1278] Yeah, yeah, yeah, yeah, yeah.

[1279] It's very hard, like, I have two choices.

[1280] One, you can shut yourself off from the world and ignore it.

[1281] I never found that compelling, this kind of idea of, like, haters are going to hate.

[1282] Yeah.

[1283] Like, this idea that anyone with a big platform, or anyone who's ever done anything, has always gotten hate.

[1284] You know, okay, maybe.

[1285] But, like, I still want to be vulnerable, wear my heart on my sleeve, really show myself, like, open myself to the world, really listen to people.

[1286] And that means every once in a while somebody will say something that touches me in a way that's like, what if they're right?

[1287] Do you let that hate influence you?

[1288] I mean, can you be bullied into a different opinion than the one you think you really hold, just because of that hate?

[1289] No, no. I believe not.

[1290] But it hurts.

[1291] In a way that's hard to explain. Like, yeah, it just gets to, like, it shakes your faith in humanity, actually, which is probably why it hurts.

[1292] Like, people that call me a Putin apologist or Zelensky apologist, which I'm currently getting almost an equal amount of, but it hurts.

[1293] It hurts because it, like, slightly damages my faith in humanity, my ability to see the love that connects us, and I'm trying to find that.

[1294] And that's, I'm doing my best in the limited capabilities I have to find that.

[1295] And so to call me something like a bad actor, essentially, from whatever perspective, it just makes me realize, well, people don't have empathy and compassion for each other.

[1296] It makes me question that for a brief moment.

[1297] And that's like a crack, and it hurts.

[1298] How many people do this to your face?

[1299] Very few.

[1300] It's online e-muscles, man. I have to be honest that it happens.

[1301] Because I've hung around with Rogan enough.

[1302] When your platform grows, there's people that will come up to Joe.

[1303] and say stuff to his face. They forget he's an actual real human being.

[1304] They'll make accusations about him.

[1305] So does that cause him to wall himself off more?

[1306] No, he's pretty gangster on that.

[1307] But yeah, it still hurts.

[1308] If you're human, if you really feel others, I think that's also the difference between Joe and me. He has a family that he deeply loves.

[1309] And that's an escape from the world for him.

[1310] There's a loneliness in me that I'm always longing to connect with people and with regular people, just to learn their stories and so on.

[1311] And so if you open yourself up that way, the things they tell you can really hurt in every way.

[1312] Like me going to Ukraine, just seeing so much loss and death. Some of it is, I mean, unforgettably haunting. Not in some kind of political way or activist way, or who's right, who's wrong way, but just, like, man, so much pain. You see it, and it just stays with you. When you see a human being be bad to another human, you can't get rid of that in your head. You can't imagine that we can treat each other like that. That's the hard part, I think. I mean, for me, it is.

[1313] When I saw parents, like, when I did the child exploitation stuff, when they rented their children out, they literally rented infant children out to others for sexual gratification.

[1314] Like, I don't know how a human being could do that to another human being.

[1315] And that sounds like the kind of thing you're going through.

[1316] I mean, I went through a huge funk when I did those cases afterwards.

[1317] I should have talked to somebody, but in the FBI, you have to keep that machismo up or they're going to take a gun away from you.

[1318] Well, I think that's examples of evil that's, like, the worst of human nature.

[1319] But just because I have...

[1320] War is just as bad.

[1321] I mean...

[1322] Somehow war, it's somehow understandable, given all the very intense propaganda that's happening.

[1323] So you can understand that there is love in the heart of the soldiers on each side, given the information they're given.

[1324] There's a lot of people on the Russian side who believe they're saving these Ukrainian cities from Nazi occupation.

[1325] Now, there are stories.

[1326] There is a lot of evidence of people murdering civilians for fun.

[1327] Now, that is closer to the things you've experienced of evil embodied.

[1328] And I haven't interacted with that directly, with people who, for fun, murdered civilians.

[1329] But you know it's there in the world.

[1330] I mean, you're not naive to it.

[1331] Yes, but if you experienced that directly, if somebody shot somebody for fun in front of me, that would probably break me. Knowing that it exists is different than seeing it yourself.

[1332] Now, I've interacted with the victims of that, and they tell me stories, and you see their homes destroyed, destroyed for no good military reason. It's civilians, civilian homes being destroyed. That really lingers with you.

[1333] Yeah, the people that are capable of that, that goes with the propaganda. I mean, if you're going to build a story, you have to, you know, you have to have, on the other side, the homes are going to be destroyed, the non-military targets are going to be destroyed. To put it in perspective, I'm not sure a lot of people understand the deep human side, or even the military strategy side, of this war. There's a lot of experts outside of the situation that are commenting on it with certainty, and that kind of hurts me, because I feel like there's a lot of uncertainty. There's so much propaganda, it's very difficult to know what is true. So my whole hope was to travel to Ukraine, to travel to Russia, to talk to soldiers, to talk to leaders, to talk to real people that have lost homes, that have lost family members, who this war has divided, who this war changed completely how they see the world, whether they have love or hate in their heart, to understand their stories.

[1334] I've learned a lot on the human side of things by having talked to a lot of people there, but it has been on the Ukrainian side for me currently.

[1335] Traveling to the Russian side is more difficult.

[1336] Let me ask you about your, now, friend?

[1337] Can we go as far as to say friend? Sabu.

[1338] Hector Monsegur.

[1339] What's the story?

[1340] What's your long story with him?

[1341] Can you tell me, what is LulzSec?

[1342] Who is Sabu?

[1343] And who's anonymous?

[1344] What is anonymous?

[1345] Where's the right place to start that story?

[1346] Probably anonymous.

[1347] Anonymous is a, it still is, I guess, a decentralized organization.

[1348] They call themselves headless, but once you look into them a little ways, they're not really headless.

[1349] The power struggle comes with whoever has hacking ability. That might be that you're a good hacker, or you have a giant botnet that's used for DDoS. So you're going to wield more power if you can control where it goes. Anonymous started doing their, like, hacktivism stuff in 2010 or so. The word hack was in the media all the time then. And then, right around then, there was a federal contractor named HBGary Federal.

[1350] Their CEO was Aaron Barr.

[1351] And Aaron Barr said he was going to come out and de -anonymize Anonymous.

[1352] He's going to come out and talk at Black Hat or DefCon or one of those and say, you know, who they are.

[1353] He figured it out, or so he thought, based on, you know, when people were online, when people were in IRC, when tweets came out. There was no scientific proof behind it or anything.

[1354] So he's just going to falsely name people that were in anonymous.

[1355] So Anonymous went on the attack.

[1356] They went and hacked into HBGary Federal, and they turned his life upside down.

[1357] They took over his Twitter account and all that stuff pretty quickly.

[1358] I have very mixed feelings about all of this.

[1359] I get, like, part of me admires the positive side of the hacktivism.

[1360] Okay.

[1361] Is there no room for admiration there of the fuck you to the man?

[1362] Not at the time.

[1363] Again, it was a violation.

[1364] Of 18 USC 1030. So it was my job. So at the time, no. In retrospect, sure. Yeah. But what was the philosophy of the hacktivism? Philosophically, were they at least expressing it for the good of humanity, or no? They outwardly said that they were going to go after people that they thought were corrupt. So they were judge and jury on corruption, and they were going to go after it. Once you get inside and realize what they were doing, they were going after people that they had an opportunity to go after.

[1365] So maybe someone had a zero day and then they searched for servers running that zero day and then from there, let's find a target.

[1366] I mean, one time they went after a toilet paper company.

[1367] I still don't understand what that toilet paper company did, but it was an opportunity to make a splash.

[1368] Was some of it for the joke, for the lulz?

[1369] It developed into that.

[1370] So I think the hacktivism and the Anonymous stuff wasn't so much for the lulz.

[1371] But from that HBGary hack, there were six guys that worked well together, and they formed a crew, a hacking crew. They kind of split off into their own private channels, and that was LulzSec. Laughing at your security was their motto.

[1372] So that's L-U-L-Z-S-E-C, LulzSec.

[1373] Of course it is.

[1374] LulzSec.

[1375] And who founded that organization?

[1376] So Kayla and Sabu were the hackers of the group, and so they really did all the work on HBGary.

[1377] So these are code names.

[1378] Yeah.

[1379] It's their online names.

[1380] They're their nicks.

[1381] And so, you know, they, that's all they knew each other as.

[1382] You know, they talked as those names.

[1383] And they worked well together.

[1384] And so they formed a hacking crew.

[1385] And that's when they started the, at first they didn't name it this, but it was the 50 days of lulz, where they would just release major, major breaches.

[1386] And it stirred up the media.

[1387] I mean, it put hacking in the media every day.

[1388] They had 400,000 or 500,000 Twitter followers.

[1389] You know, and it was kind of interesting.

[1390] But then they started swinging at the beehive, and they took down some FBI-affiliated sites.

[1391] And then they started Fuck FBI Fridays, where every Friday they would release something.

[1392] And we waited with bated breath.

[1393] I mean, they had us hook, line, and sinker, seriously pissed.

[1394] We were waiting to see what was going to be dropped every Friday.

[1396] It was a little embarrassing looking back on it now.

[1397] And this is in the early 2010s.

[1398] Yeah.

[1399] This was 2010 -2011 around there.

[1400] So let me actually linger on Anonymous.

[1401] Do you, do we still understand what the heck Anonymous is?

[1402] It's just a place where you hang out.

[1403] I mean, it started on 4chan, went to 8chan, and it's really just anyone.

[1404] You could be an anonymous right now if you wanted to.

[1405] Just you're in there hanging out in the channel.

[1406] Now, you're probably not going to get much credit until you work your way up and prove who you are.

[1407] Someone vouches for you.

[1408] but anybody can be an anonymous.

[1409] Anybody can leave anonymous.

[1410] What's the leadership of anonymous?

[1411] Do you have a sense that there is a leadership?

[1412] There's a power play.

[1413] Now, there's not someone that, you know, says, this is what we're doing, and that's all we're doing.

[1414] I love the philosophical and the technical aspect of all of this, but I think there is a slippery slope to where, for the lulz, you can actually really hurt people.

[1415] That's the terrifying thing.

[1416] When you attach, I'm actually really terrified of the power of the lulz.

[1417] The fun thing somehow becomes a slippery slope.

[1418] I haven't quite understood the dynamics of that, but even in myself, if you just have fun with the thing, you lose track of the ethical grounding of the thing.

[1419] And so, like, it feels like hacking for fun can just, like, literally lead to nuclear war.

[1420] Like, literally destabilize things, yada, yada, nuclear war.

[1421] I could see it.

[1422] Yeah.

[1423] So I've been more careful with the lulz.

[1424] Yeah, I've been more careful about that.

[1425] And I wonder about it because in Internet speak, somehow ethics can be put aside through the slippery slope of language.

[1426] I don't know.

[1427] Everything becomes a joke.

[1428] If everything's a joke, then everything's allowed, and if everything's allowed, then you don't have a sense of what is right and wrong.

[1429] You lose sense of what is right and wrong.

[1430] You still have victims.

[1431] I mean, you're laughing at someone.

[1432] Someone's the butt of this joke.

[1433] You know, whether it's major corporations or individuals. I mean, some of the stuff they did was just, you know, releasing people's PII, their personally identifiable information, and stuff like that.

[1434] I mean, is it a big deal?

[1435] I don't know.

[1436] Maybe, maybe not.

[1437] But, you know, if you could choose to not have your information put out there, you probably wouldn't.

[1438] Do we have a sense of what Anonymous is today?

[1439] Has it ever been one stable organization, or is it a collection of hackers that kind of emerge for particular tasks, for particular, like, hacktivism tasks and that kind of stuff?

[1440] It's a collection of people that has some hackers in it.

[1441] There's not a lot of big hackers in it.

[1442] I mean, there's something that will come bounce in and bounce out.

[1443] Even back then, there were probably just as many reporters in it, people of the media in it with the hackers, at the time just trying to get the inside scoop on things. You know, some giving the inside scoop. We arrested a reporter that gave over the username and password to his newspaper, you know, just so he could break the story. They trusted him. Speaking of trust, reporters. Boy. There's good ones. There are, there are, but boy, do I have a complicated relationship with them. How many stories about you are completely true?

[1444] You can just make stuff up on the internet.

[1445] And one of the things that, I mean, there's so many fascinating psychological, sociological elements of the internet to me, one of them is that you can say that Lex is a lizard, right?

[1446] And it helps if it's funny, so lizard is kind of funny.

[1447] What should we say?

[1448] Lex has admitted to being an agent of the FBI, okay?

[1449] You can just say that, right?

[1450] And then the response of the internet would be like, oh, is that true?

[1451] I didn't realize that.

[1452] They won't go, like, provide evidence, please.

[1453] Right?

[1454] They'll just say, like, oh, that's weird.

[1455] I kind of thought he might be kind of weird.

[1456] And then it piles on.

[1457] It's like, hey, hey, hey, guys.

[1458] Like, here's a random dude on the internet who just said a random thing.

[1459] You can't just, like, pile on. And then.

[1460] Johnny 6969 is now a source that says.

[1461] And then, like, the thing is, I'm a tiny guy, but when it grows, if you, like, have a big platform, I feel like newspapers will pick that up, and then they'll, like, start to build on a story.

[1462] And you never know where that story really started.

[1463] It's so cool.

[1464] I mean, to me, actually, honestly, it's kind of cool that there's a viral nature of the internet that can just fabricate truth completely.

[1465] I think we have to accept that new reality and try to deal with it somehow.

[1466] You can't just, like, complain that Johnny 6969 can start a random thing.

[1467] But I think, in the best possible world, it is the role of the journalist to be the adult in the room and put a stop to it, versus look for the sexiest story so that there could be clickbait that can generate money.

[1468] Journalism should be about sort of slowing things down, thinking deeply through what is true or not, and showing that to the world.

[1469] I think there's a lot of hunger for that, and I think that would actually get the clicks in the end.

[1470] I mean, it's that same pressure I think we're talking about with the FBI and with the tech companies that controls us.

[1471] I mean, the editors have to please and get those clicks.

[1472] I mean, they're measured by those clicks.

[1473] So, you know, I'm sure the journalists, the true journalists, the good ones out there want that, but they want to stay employed too.

[1474] Can I actually ask you, really as another tangent, about Jared and others, they're doing undercover work.

[1475] In terms of the tools you have for catching cybercriminals, how much of it is undercover?

[1476] Undercover is a high bar to jump over.

[1477] You have to do a lot to start an undercover in the FBI.

[1478] There's a lot of thresholds.

[1479] So it's not your first investigative step.

[1480] You have to identify a problem and show that the lower steps can't get you there.

[1481] But I mean, I think we had an undercover going on the squad at all times. When one was being shut down or taken down, we were spinning up another one. So it's a good tool to have, you know, and utilize. They are a lot of work. I think if you run one, you'll never run another one in your life. Oh, so it's, like, psychologically? Is it a lot of work just technically, but also psychologically? Like, you have to really, it's 24/7, you're inside that world. Like, you have to know what's going on and what's happening. You have to remember who you are, because you're a criminal online.

[1482] You have to go to a special school for it, too.

[1483] Was that ever something compelling to you?

[1484] I went through the school, but I'm a pretty open and honest guy, and so it's tough for me to build that wall of lies.

[1485] Maybe I'm just not smart enough to keep all the lies straight.

[1486] Yeah, but a guy who's good at building up a wall of lies would say that exact same thing.

[1487] Exactly.

[1488] Yeah, it's so annoying the way truth works in this world.

[1489] It's like, people have told me, like, because I'm trying to be honest and transparent, that's exactly what an agent would do, right?

[1490] But I feel like an agent would not wear a suit and tie.

[1491] I wore a suit and tie every day.

[1492] I was a suit and tie guy.

[1493] You were?

[1494] Yeah, every day.

[1495] I remember one time I wore shorts in and the SAC came in.

[1496] And this was when I was a rock star at the time in the bureau, and I had shorts on.

[1497] And I said, sorry, ma 'am, I apologize for my attire.

[1498] And she goes, you could wear bike shorts in here, I wouldn't care.

[1500] I was like, oh, shit.

[1501] That sounds nice.

[1502] I never wore the bike shorts, but yeah.

[1503] But see, I see a suit and tie as constraining.

[1504] I think it's liberating, in a sense.

[1505] It's like shows that you're taking the moment seriously.

[1506] Well, not just that, people wanted it.

[1507] I mean, people expected it, that you are dressed like a perfect FBI agent.

[1508] When someone knocks on their door, that's what they want to see.

[1509] They want to see what Hollywood built up is what an FBI agent is.

[1510] You show up like my friend Ilhwan.

[1511] He was always dressed in T-shirts and shorts.

[1512] People aren't going to take him seriously.

[1513] They're not going to get what they want.

[1514] I wonder with how many people I could just show up and, like, say I'm from the FBI and start interrogating them.

[1515] Like at a bar.

[1516] Probably.

[1517] Oh, definitely.

[1518] If they've had a few drinks, you can definitely.

[1519] Well, but people are going to recognize you.

[1520] That's the only problem.

[1521] That's another thing.

[1522] Once you start taking down big cases,

[1523] You can't work cases anymore in the FBI.

[1524] Your face gets out there.

[1525] Your name, too?

[1526] Well, actually, let me ask you about that before we return to our friend Sabu.

[1527] Okay.

[1528] You've tracked and worked on some of the most dangerous people in this world. Have you ever feared for your life? So I had to make a really, really shitty phone call one time. I was sitting in the bureau, and this was right after Silk Road, and Jared called me. He was back in Chicago, and he called me and said, hey, your name and your kids' names are on a website for an assassination.

[1529] They're paying to have you guys killed.

[1530] Now, these things happen on the black market.

[1531] They come up, you know, and people debate whether they're real or not.

[1532] But we have to take it serious.

[1533] Someone's paying to have me killed. So I had to call my wife, and we have a word, that if I said this word, and we only said it one time to each other, if I said this, this is serious.

[1534] Drop what you're doing and get to the kids.

[1535] And so, I had to drop the word to her.

[1536] And I could feel the breath come out of her because she thought her kids were in danger.

[1537] At the time, they were.

[1538] I wasn't in a state of mind to drive myself.

[1539] So an agent on the squad, a girl named Evelina, she drove me, lights and sirens, all the way to my kids' school.

[1540] I called the school, and we had it locked down.

[1541] We were in a lockdown.

[1542] Nobody should get in or out, especially someone with a gun.

[1543] The first thing they did was let me in the building with a gun.

[1544] So I was a little disappointed with that.

[1545] My kids were, I think, in kindergarten and fifth grade, or somewhere around there.

[1546] Maybe first or second.

[1547] I'm not sure where.

[1548] But all hell broke loose.

[1549] And we had to, from there, go move into a safe house.

[1550] I live in New York City.

[1551] NYPD surrounded my house.

[1552] The FBI put cameras outside my house.

[1553] You couldn't drive in my neighborhood without your license plate being read.

[1554] Hey, why is this person here?

[1555] Why is that person there?

[1556] I got to watch my house on an iPad while I sat at my desk.

[1557] But, you know, again, I put my family through that, and it scared the shit out of them.

[1558] And, to be honest, I think that's sort of it. My mother-in-law's words were, I thought you did cybercrime.

[1559] And because during Silk Road, I didn't tell my family what I was working on.

[1560] I don't talk about that.

[1561] I want to escape that.

[1562] I don't want to be there, you know.

[1563] I remember that, like, so when I was in the FBI, driving in, I used to go in at 4:30 every morning, because I used to go to the gym before I got to the desk.

[1564] I'd be at the desk at seven, so in the gym at five, a couple hours and then go.

[1565] The best time I had was that drive -in in the morning where I could just be myself.

[1566] I listened to a sports podcast out of D.C., and they talked about sports, the Nationals and whatever it was, the Capitals.

[1567] It was great to not think about Silk Road for 10 minutes. That was my best time. But yeah, again, I've had that, the move into the safe house. I left my MP5 at home, that's the bureau's machine gun, and showed my wife to just pull and spray. But how often did you work and live with fear in your heart? It was only that time, I mean, for actual physical security. Then, I mean, after the Anonymous stuff, you know, I really tightened down my cybersecurity.

[1568] You know, I don't have social media.

[1569] I don't have pictures of me and my kids online.

[1570] I don't really, if I go to a wedding or something, I say, don't take my picture with my kids, you know, if you're going to post it someplace or something like that.

[1571] So that sort of security I have.

[1572] But, you know, just like everybody, you start to relax a little bit and security breaks down because it's not convenient.

[1573] But it's also part of your job.

[1574] So you're much better.

[1575] at, like me, your job now and your job before, so you're probably much better at taking care of the low-hanging fruit, at least.

[1576] I understand the threat, and I think that's what a lot of people don't understand, is understanding what the threat against them is.

[1577] So I'm aware of that, of what could possibly happen, and I think about it, you know, I think about things.

[1578] I do remember, so you tripped a memory in my mind, I remember a lot of times, and I had a gun on my hip, I still carry a gun to this day, opening my front door and being concerned what was on the other side, walking out of the house because I couldn't see it.

[1579] I remember those four o'clock mornings heading to the car, I was literally scared.

[1580] Yeah.

[1581] I mean, having seen some of the things you've seen, it makes you perhaps question how much evil there is out there in the world.

[1582] How many dangerous people there are.

[1583] They're out there.

[1584] Crazy people even.

[1585] There's a lot of crazy.

[1586] There's a lot of evil.

[1587] Most people, I think, who get into cybercrime are just opportunistic.

[1588] Not necessarily evil.

[1589] They don't really, maybe, think about the victim.

[1590] It's a crime of opportunity.

[1591] I don't label that as evil.

[1592] And one of the things about America that I'm also very happy about is that rule of law, despite everything we talk about, it's tough to be a criminal in the United States.

[1593] So, like, if you walk outside of your house, you're much safer than you are in most other places in the world.

[1594] You're safer, and the system's tougher.

[1595] I mean, LulzSec, six guys, one guy in the United States, five guys other places.

[1596] Hector was facing 125 years.

[1597] Those guys got slaps on the wrist and went back to college.

[1598] You know, different laws, different places.

[1599] So who's Hector?

[1600] Tell me the story of Hector.

[1601] So this LulzSec organization was started.

[1602] So Hector was before that, he was in part anonymous.

[1603] He was doing all kinds of hacking stuff, but then he launched LulzSec.

[1604] He's an old -school hacker.

[1605] I mean, he learned how to hack, and I don't want to tell his story, but he learned to hack because he grew up in the Lower East Side of New York and picked up some NYPD computers that were left on the sidewalk for trash, taught himself how to...

[1606] He doesn't exactly look like a hacker.

[1607] For people who don't know, he looks, I don't know exactly what he looks like, but it's not, like, a technical guy, not what you would imagine.

[1608] But perhaps that's a Hollywood portrayal.

[1609] Yeah, I think you get in trouble these days saying what a hacker looks like.

[1610] I don't know if they have a traditional look.

[1611] Just like I said, Hollywood has an idea of what an FBI agent looks like.

[1612] I don't think you can do that anymore.

[1613] I don't think you can say that anymore.

[1614] Well, he certainly has a big personality and charisma and all that kind of stuff.

[1615] That's Sabu.

[1616] I can see him selling me anything.

[1617] That's Sabu.

[1618] That's convincing me of anything.

[1619] You know, there's two different people.

[1620] There's Sabu and there's Hector.

[1621] Hector is a sweet guy.

[1622] He likes to have intellectual conversations, and, like, that's just a thing.

[1623] He'd rather, you know, just sit there and have a one -on -one conversation with you.

[1624] But Sabu, that's a raucous motherfucker.

[1625] And you first met Sabu?

[1626] I was tracking Sabu.

[1627] That's all I knew was Sabu.

[1628] I didn't know Hector.

[1629] So when did your paths cross?

[1630] in terms of tracking.

[1631] When did you first take on the case?

[1632] The spring of '11.

[1633] So it was through Anonymous.

[1634] Through Anonymous.

[1635] Well, really kind of LulzSec.

[1636] We were, LulzSec was a big thing, and it was pushed out to all the cyber, you know, 56 field offices in the FBI.

[1637] Most of them have cyber squads or cyber units.

[1638] And so, you know, it was being pushed out there.

[1639] And it was in the news every day, but it really wasn't ours.

[1640] So we didn't have a lot of victims in our

[1641] area of responsibility.

[1642] And so we just kind of paid attention to it.

[1643] Then I got a tip that a local hacker in New York had broken into AOL.

[1644] And so Olivia Olson and I, she's another agent, she's still in.

[1645] She's a supervisor out in L.A. She's a great agent.

[1646] We went all around New York looking for this kid just to see what we can find and ended up out in Staten Island at his grandmother's house.

[1647] She didn't know where he was, obviously.

[1648] Why would she?

[1649] but I left my card. He gave me a call that night and started talking to me, and I said, let's just meet up tomorrow at the McDonald's across from 26 Fed. And he came in, and the three of us sat there and talked, and, you know, he gave me stuff. He started telling me about all the felonies he was committing those days, including that break-in. And then he finally says, you know, I can give you Sabu. And Sabu to us was the Keyser Söze. He was our guy, you know, he was the guy that was in the news that was pissing us off. So he was part of the FBI Fridays? Sabu? Yeah. Oh, he led it. Yeah, he was the leader of Fuck FBI Fridays. So, yeah, what was one of the more memorable, uh, the triple F's? I said, what, how, why, how and why do you go after the beehive? That's kind of intense. It gets you in the news. It's the lulz.

[1650] It's funnier to go after the big ones.

[1651] You know, and they weren't getting, like, real FBI.

[1652] They weren't breaking into FBI mainframes or anything, but they were, you know, FBI-affiliated sites and that sort of thing.

[1653] A lot of law enforcement stuff was coming out.

[1654] So, but, you know, we looked back.

[1655] And so if this kid knew Sabu, maybe there was a chance we could use him to lure Sabu out.

[1656] But we also said, well, maybe this kid knows Sabu in real life.

[1657] And so we went and looked through the IPs,

[1658] and out of 10 million IPs, we find one, and it belonged to him.

[1659] And so that day, someone had doxed Sabu, and we were a little afraid he was going to be on the run.

[1660] We had a surveillance team, and FBI surveillance teams are awesome.

[1661] Like, you cannot even tell they're FBI agents.

[1662] It's, they are really that good.

[1663] I mean, there's baby strollers and all, whatever you wouldn't expect an FBI agent to have.

[1664] So that's a little like the movies.

[1665] A little bit, yeah, I mean, it is true.

[1666] but they fit into the area.

[1667] So now they're on the Lower East Side, which is, you know, a baby stroller might not fit in there as well.

[1668] You know, somebody's laying on the ground or something like that.

[1669] They really get, play the character and get into it.

[1670] So now I can never trust a baby stroller again.

[1671] Yeah, well, probably shouldn't.

[1672] Every baby, I'm just like, look at, stare at them suspiciously.

[1673] Especially if the mom's wearing cargo pants while she pushes it.

[1674] Yeah, so if it's like a very stereotypical mom, stereotypical baby, I'm going to be very suspicious.

[1675] I'm going to question the baby.

[1676] A baby's wired.

[1677] Be careful.

[1678] You know, we raced out there and, like, our squad's not even full.

[1679] There's only a few guys there.

[1680] And like I said, I was a suit guy, but that day I had shorts and a t -shirt on.

[1681] I had a white t -shirt on.

[1682] And I only bring it up because Sabu makes fun of me to this day.

[1683] So I had a bulletproof vest and a white t -shirt on.

[1684] And that was it.

[1685] I had shorts, too, and all that.

[1686] But raced over to there.

[1687] We didn't have any equipment.

[1688] We brought our boss's boss's boss.

[1689] He stopped off at NYPD, got us like a ballistic shield.

[1690] and a battering ram if we needed it.

[1691] And then we get to Hector's house, Sabu's house, and he's on the sixth floor.

[1692] And so normally, you know, we're the cyber dork squad.

[1693] We'll hop in the elevator.

[1694] Six floors is a long way to go up in bulletproof vests with a ballistic shield.

[1695] But we had been caught in an elevator before on a search, so we didn't.

[1696] Took the stairs.

[1697] We get to the top, a tad winded, knock on the door, and this big, towering guy opens the door just slightly, and he sees the green vest with big yellow letters, FBI, and he steps outside.

[1698] Can I help you?

[1699] You know, and tries to social engineer us.

[1700] But eventually, we get our way inside the house.

[1701] You know, I notice a few things that are kind of out of place.

[1702] There's a laptop charger and a flashing modem.

[1703] And I said, do you have a computer here?

[1704] And he said, no, there's no computer here.

[1705] So we knew the truth.

[1706] and then the half lies and all that sort of thing.

[1707] So it took us about another two hours, and finally he gave up that he was Sabu.

[1708] He was the guy we were looking for.

[1709] So we sat there and we kind of showed him sort of the evidence we had against him.

[1710] And from his words, we sat there and talked like two grown adults.

[1711] And, you know, I gave him the options.

[1712] And he said, well, let's talk about working together.

[1713] So he chose to become an informant.

[1714] I don't think he chose that night, but that's where it kind of went to.

[1715] So we brought him down to the FBI that night, which was a funny trip because I'm sitting in the back seat of the car with him.

[1716] And I was getting calls from all over the U.S. from different FBI agents saying that we arrested the wrong guy.

[1717] I was like, I don't think so.

[1718] And they're like, why do you think so?

[1719] I was like, because he says it's him.

[1720] And they still said, nope, the wrong guy.

[1721] So I said, well, we'll see how it plays out.

[1722] That's so interesting because it's such a strange world.

[1723] It's such a strange world because it's tough to, because you still have to prove it's the same guy, right?

[1724] Because the anonymity.

[1725] Yeah, I mean, we had his laptop by that point.

[1726] Yeah, I know.

[1727] Him saying that helped.

[1728] I guess that's a clue in my world.

[1729] Yeah, yeah.

[1730] But yeah, if he would have fought it, I mean, that definitely would have come in as evidence, that other agents are saying it's not him. You have to disclose that stuff.

[1731] So you had a lot of stuff on him.

[1732] What was he facing? You said he was facing 125 years? 125 years in prison. That's if you took every charge we had against him and ran them, you know, consecutively. No one ever gets charged that. But yeah, he had, essentially it would have been 195 years. You know, fast forward to the end, he got thanked by the judge for his service, after nine months, and he walked out of the court a free man. But that's while being an informant? Yes. Well, so the word informant here really isn't that good, it's not fitting. Technically, I guess that's what he was, but he didn't know the other people, it was all anon, he knew nicks and all that. He really gave us the inside of what was happening in the hacker world. Like I said, he was an old-school hacker. He was back when hackers didn't work together, with Anonymous. He was down with Cult of the Dead Cow and those type guys, like, way back, and he was around for that. He's like an encyclopedia of hacking. But, you know, his prime was in the '90s era of hacking. But yeah, he kind of came back when Anonymous started going after MasterCard and PayPal and all that, doing the WikiLeaks stuff. But even with that little interaction, being an informant, he probably made a lot of enemies. How do you protect a guy like that?

[1733] He made enemies after it was revealed.

[1734] How does the FBI protect him?

[1735] Good luck.

[1736] I mean, perhaps I'll talk to him one day.

[1737] But is that guy afraid for his life?

[1738] Again, I think...

[1739] It doesn't seem like it.

[1740] He has very good security for himself, cybersecurity.

[1741] But, you know, he doesn't like the negative thing said about him online.

[1742] I don't think anybody does.

[1743] But, you know, I think it's...

[1744] after so many years of the internet kind of bitching at you and all that, you get callous to it. It's just internet bitching.

[1745] And also the hacking world moves on very quickly.

[1746] He is kind of, like they have their own wars to fight now, and he's not part of those wars anymore.

[1747] There's still people out there that bitch and moan about him, but yeah, I think it's less.

[1748] I think, you know, and he has a good message out there of, you know, trying to keep kids from making the same mistakes he made.

[1749] He tries to really preach that.

[1750] How do people get into this line of work?

[1751] Are there all kinds of ways? Meaning not your line of work, his line of work.

[1752] Just all the stories you've seen of people that are in Anonymous and LulzSec and Silk Road, and all the cybercriminals you've interacted with.

[1753] What's the profile of a cybercriminal?

[1754] I don't think there's a profile anymore.

[1755] You know, I used to be able to say, you know, the kid in your mom's basement or something like that, but it's not true anymore, you know, like, it's, it's wide.

[1756] It's like, I've arrested, I've arrested people that you wouldn't expect would be cyber criminals.

[1757] And it's in the United States, it's international, it's everything.

[1758] Oh, it's international.

[1759] I mean, we're seeing a lot of the big hackers now.

[1760] The big arrests for hackers in England, surprisingly, you know, there's, you know, you're not going to see there's a lot of good hackers like down in Brazil.

[1761] but I don't think Brazilian law enforcement is as good at hunting them down, so you're not going to see the bigger arrests.

[1762] How many state-sponsored cyber attacks are there, do you think?

[1763] More than you can imagine.

[1764] What do you want to call an attack?

[1765] A successful attack or just a probing?

[1766] Probing for information, just, like, feeling out, you know, testing where the attack vectors are, trying to collect all the possible attack vectors.

[1767] Put a Windows 7 machine on the internet, forward-facing, and put a packet sniffer on there and look at where the traffic comes from.

[1768] I mean, in 24 hours, you're going to fill up a hard drive with packets just coming at it.

[1769] Yeah.

[1770] I mean, it's not hard to know.

[1771] I mean, it's just constantly probing for entry points into things.

[1772] You know, you could go mad putting up honeypots.

[1773] Draws in intrusions.

[1774] To see what the methodology is.

[1775] Just to see what's out there.

[1776] Yeah, and it doesn't go anywhere.

[1777] It maybe has fake information and stuff like that.

[1778] You know, it's kind of to see.

[1779] what's going on and judge what's happening on the internet, get a, you know, lick your finger and test the wind of what's happening these days.
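The honeypot idea described above can be sketched in a few lines: a minimal listener that accepts connections on an otherwise unused port and records who knocked. This is an illustrative sketch, not anything from the conversation; the port number and connection cap are arbitrary choices.

```python
import socket
import time

def run_honeypot(port=2222, max_conns=5):
    """Listen on an otherwise unused port and log every connection attempt.

    Port 2222 is an arbitrary choice for illustration; real internet scans
    hammer well-known ports like 22, 23, 80, and 445.
    """
    log = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen()
    for _ in range(max_conns):
        conn, (ip, src_port) = srv.accept()
        # Record the probe (timestamp, source IP, source port)...
        log.append((time.time(), ip, src_port))
        # ...and offer nothing back.
        conn.close()
    srv.close()
    return log
```

Pointed at the open internet, a listener like this fills its log exactly the way described here: with a constant stream of unsolicited probes.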

[1780] The funny thing about, like, because I'm at MIT, that attracted even more attention, not for the lulz, but for the technical challenge.

[1781] It seems like people enjoy hacking MIT.

[1782] It's just the amount of traffic MIT was getting for that in terms of just the sheer number of attacks from different places.

[1783] It's crazy.

[1784] Yeah, like, just like that, putting up a machine, seeing what comes.

[1785] NASA used to be the golden ring.

[1786] Everybody got NASA.

[1787] That was like the early '90s.

[1788] If you could hack NASA, that was the thing.

[1789] Yeah, MIT is a big one.

[1790] Yeah, it's fun.

[1791] It's fun to see.

[1792] Respect.

[1793] Because I think in that case, it comes from a somewhat good place because, you know, they're not getting any money from MIT.

[1794] It's more for the challenge.

[1795] Let me ask you about that, about this world of cybersecurity.

[1796] How big of a threat are cyber attacks for companies and for individuals?

[1797] Like, let's lay out, where are we in this world? What's out there? It's the wild, wild west. And, I mean, people want the idea of security, but it's inconvenient, so they don't, they push back on it. And there are a lot of opportunistic, nation-state, financially motivated hackers, hackers for the lulz. You've got three different tiers there. And they're on the prowl. They have tools, they have really good tools that are being used against us. And at what scale? So when you're thinking of, I don't know, let's talk about companies first. So say you're talking to a mid-tier, I wonder what the most interesting business is. So Google, let's, we can look at large tech companies, or we can look at medium-sized tech companies. And, like, you were sitting in a room with a CTO, with the CEO, and the question is, how fucked are we?

[1798] And what should we do?

[1799] What's the low-hanging fruit?

[1800] What are the different strategies and those companies should consider?

[1801] I mean, the problem is they want a push button.

[1802] They want an out-of-the-box solution that says I'm secure, you know, they want to tell people they're secure.

[1803] And that's very challenging to have.

[1804] It's impossible.

[1805] If I could, if someone had it, they'd be a billionaire.

[1806] You know, they'd be beyond a billionaire, you know, because that's what everybody wants.

[1807] So, you know, you can buy all the tools you want.

[1808] It's configuring them the proper way.

[1809] And if anyone's trying to tell you that there's one solution that fits all, they're snake oil salesmen.

[1810] And there's a lot of people in cybersecurity that are snake oil salesmen.

[1811] Yeah, and I feel like there's tools that, if they're not configured correctly, don't increase security significantly, and they introduce a lot of pain for the people.

[1812] They decrease efficiency of the actual work.

[1813] you have to do.

[1814] So, like, we had, I was at Google for a time, and I think mostly I want to give props to their security efforts. But user data, so, like, data that belongs to users, is, like, holy. The amount of security they have around that is incredible.

[1815] So most, any time I had to work with anything even resembling user data.

[1816] I never got a chance to work with actual user data.

[1817] Anything resembling that, first of all, you have no access to the internet.

[1818] It's impossible to even come close to accessing the internet.

[1819] And there's so much pain to actually, like, interact with that data.

[1820] I mean, it was extremely inefficient.

[1821] In places where I thought it didn't have to be that inefficient, the security was too much.

[1822] But I have to give respect to that because you, in that case, you want to err on the side of security, but that's Google.

[1823] They were doing a good job of this.

[1824] The reputational harm, if it got out, I mean, Google, you know, why is Google drive -free, you know, because they want your data.

[1825] They want you to park your data there.

[1826] So, you know, if they got hacked or leaked information, the reputational harm would be tremendous.

[1827] But, you know, for a company that's not, it's really hard to do that, right?

[1828] And a company that's not as big as Google, or not as tech-savvy as Google, might have a lot of trouble with doing that kind of stuff.

[1829] Instead of increasing security, they'll just decrease the efficiency.

[1830] Well, yeah, so there's a big difference between IT and security.

[1831] And unfortunately, these mid-sized companies, they try to stack security into their IT department.

[1832] Your IT department is about business continuity.

[1833] They're about trying to move business forward.

[1834] They want your users to get the data they need to do their job so the company can grow.

[1835] Security is not that.

[1836] They don't want you to get the data.

[1837] They, you know, but there's fine tuning you can do to, you know, ensure that.

[1838] I mean, it's simple as like having good onboarding procedures for employees.

[1839] Like, like, you come into my company, you don't need access to everything.

[1840] Maybe you need access to something for one day.

[1841] Turn the access on.

[1842] Don't leave it on.

[1843] I mean, I was the victim of the OPM hack, the Office of Personnel Management, because old credentials from a third -party vendor were sitting there inactive.

[1844] And the Chinese government found those credentials and were able to log in and steal all my information.

[1845] So a lot could be helped if you just control the credentials.

[1846] the access, the access control, how long they last. And people who need access to a certain thing only get access to that thing and nothing else, and then it just gets refreshed like that.

[1847] Access control, you know, like we said, setting up people, people leaving the company, get rid of their access, they don't need it.

[1848] Two-factor authentication, you know, that's a big thing. I mean, I sound like a broken record, because this isn't anything new.

[1849] This isn't rocket science.

[1850] The problem is we're not implementing it.

[1851] If we are, we're not doing it correctly, because these guys are taking us.
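The onboarding point above, turn access on for a day and then turn it off, is essentially time-boxed, least-privilege access control. A minimal sketch of the idea; the user and resource names are made up for illustration:

```python
import time

class AccessGrant:
    """Time-boxed, least-privilege access: grant a resource for a window,
    deny after expiry. An illustrative sketch, not a production system."""

    def __init__(self):
        # (user, resource) -> expiry as epoch seconds
        self._grants = {}

    def grant(self, user, resource, ttl_seconds, now=None):
        now = time.time() if now is None else now
        self._grants[(user, resource)] = now + ttl_seconds

    def revoke(self, user, resource):
        # Offboarding: remove the grant entirely.
        self._grants.pop((user, resource), None)

    def allowed(self, user, resource, now=None):
        now = time.time() if now is None else now
        expiry = self._grants.get((user, resource))
        # Deny by default; allow only an unexpired, explicit grant.
        return expiry is not None and now < expiry
```

The design choice worth noting is the default: no entry means no access, so forgetting to grant fails closed, while the OPM-style failure mode (old credentials sitting active forever) requires someone to have set an expiry far in the future on purpose.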

[1852] Well, two -factor authentication is a good example of something that I just was annoyed by for the longest time because, yes, it's very good, but like it seems that it's pretty easy to implement horribly to where it's like, it's not convenient at all for the legitimate user to use.

[1853] It should be trivial to do, like to authenticate yourself twice.

[1854] It should be super easy.

[1855] If security, if it's slightly inconvenient for you, think about how inconvenient it is for a hacker, and how they're just going to move on to the next person.

[1856] Yes, yes.

[1857] In theory, when implemented extremely well.

[1858] Yeah.

[1859] But I just don't think so.

[1860] I think actually if it's inconvenient, it shows that system hasn't been thought through a lot.

[1861] Do you know why we need two -factor authentication?

[1862] People using the same password across different sites.

[1863] So when one site is compromised, people just take that username and

[1864] password, it's called credential stuffing, and just stuff it across the internet.

[1865] So if 10 years ago, when we told everybody, don't use the same fucking password across the internet, across vulnerable sites, maybe two-factor wouldn't be needed.

[1866] Yeah, so you wouldn't need two-factor if everyone did a good job with passwords.

[1867] Yeah.

[1868] Right.
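One defensive answer to the credential stuffing just described is checking whether a password already appears in a known breach. The Pwned Passwords service exposes this through a k-anonymity range API: only the first five hex characters of the password's SHA-1 ever leave your machine. A sketch of the client-side logic, with the network fetch abstracted behind a function so it can be tested offline:

```python
import hashlib

def pwned_count(password, fetch_range):
    """k-anonymity breach check in the style of the Pwned Passwords API.

    Only the first 5 hex chars of the SHA-1 are sent anywhere; the server
    returns every suffix sharing that prefix, and the match happens locally.
    `fetch_range(prefix)` must return lines of the form 'SUFFIX:COUNT'
    (in practice that would be an HTTPS call; here it is injected).
    """
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    for line in fetch_range(prefix):
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            # How many times this password appears in known breaches.
            return int(count)
    return 0
```

A nonzero count means the password is already in the credential-stuffing lists and should never be reused anywhere.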

[1869] But I'm saying like the two -factor authentication, it should be super easy to authenticate myself with some other device really quickly.

[1870] Like, there should be, it should be frictionless, like you just hit okay, okay, and anything that belongs to me. Yeah. And, like, it should, very importantly, be easy to set up what belongs to me. I don't know the full complexity of the cyber attacks these platforms are under. They're probably under an insane amount of attacks. Yeah, you've got it right there, that people have no idea, these large companies, how often they're attacked, you know, on a per-second basis, and then have to fight all that off and pick out the good traffic in there.

[1871] So yeah, I would, there's no way I'd want to run a large tech company.
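For reference, the most common form of two-factor authentication being discussed, the six-digit authenticator-app code, is just TOTP (RFC 6238): an HMAC-SHA1 over a shared secret and the current 30-second time window. A stdlib-only sketch:

```python
import hashlib
import hmac
import struct

def hotp(secret, counter, digits=6):
    """HOTP (RFC 4226): HMAC-SHA1 over a big-endian 8-byte counter,
    dynamically truncated to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, unix_time, step=30, digits=6):
    """TOTP (RFC 6238): HOTP where the counter is the current time window."""
    return hotp(secret, int(unix_time) // step, digits)
```

This is why two-factor blunts credential stuffing: a leaked password alone no longer logs in, because the attacker would also need the per-account shared secret that generates these codes.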

[1872] What about protecting individuals for individuals?

[1873] What's good advice to try to protect yourself from this increasingly dangerous world of cyber attacks?

[1874] Again, educate yourself so that you understand that there is a threat.

[1875] First, you have to realize that.

[1876] Then you're going to step up and you're going to do stuff a little bit more.

[1877] Sometimes, I guess, I take that to a little bit of an extreme. I remember one time my mom called me, and she was screaming that, I woke up this morning and I just clicked on a link, and now my phone is making weird noises. And I was like, throw your phone in a glass of water, just put it in a glass of water right now. I made my mom cry. It was not a pleasant thing. So sometimes I go to little extremes on those ones. But understanding there's a risk, and making it a little bit more, a little more difficult to become a victim.

[1878] I mean, just understanding certain things.

[1879] You know, simple things like, you know, as we add more internet of things to people's houses, I mean, how many Wi-Fi networks do people have?

[1880] It's normally just one, and you're bumping your phones and giving your password to be able to come to visit.

[1881] Set up a guest network.

[1882] Set up something you can change every 30 days.

[1883] Simple little things like that.

[1884] You know, I hate to remind you, but change your passwords.

[1885] I mean, I feel like I'm a broken record again.

[1886] But just make it more difficult for others to victimize you.

[1887] And then don't use the same password everywhere.

[1888] That, that, yes.

[1889] I mean, I still know people that do that.

[1890] I mean, Ask.fm got popped last week, two weeks ago.

[1891] And that's 350 million username and passwords with connected Twitter accounts, Google accounts, you know, all the different social media accounts.

[1892] You know, that is a treasure trove for the next two and a half, three years of just using those credentials everywhere.

[1893] Using, you'll learn, even if it's not the right password.

[1894] you learn people's password styles.

[1895] You know, bad guys are making portfolios out of people.

[1896] You know, we're figuring out how people generate their passwords and kind of, you know, figuring, and then it's easier to crack their password.

[1897] You know, we're making a dossier on each person.

[1898] It's 350 million dossiers just in that one hack.

[1899] Yahoo, there was a half a billion.

[1900] So the thing a hacker would do with that is try to find all the low-hanging fruit, like have some kind of program that, yeah, evaluates the strength of the passwords,

[1901] and then finds the weak ones, and that means that this person is probably the kind of person that would use the same password across multiple sites.

[1902] Or even just write a program.

[1903] Remember the Ring hack a year or two ago?

[1904] That's all it was.

[1905] It was credential stuffing.

[1906] So Ring, the security system, by default had two-factor, but it wasn't turned on.

[1907] And they also had a setting, don't allow unlimited tries to log into my account.

[1908] You can lock it out after 10.

[1909] By default, not turned on because it's not convenient for people.

[1910] You know, Ring was like, we want people to stick these little things up and have security in their house.

[1911] But, you know, with cybersecurity, don't make it inconvenient.

[1912] Then people won't buy our product.

[1913] That's how they got hacked.

[1914] People wanted to say that it was insecure and got hacked, reputational harm right there for Ring, but they didn't get hacked.

[1915] It was just credential stuffing.

[1916] People bought usernames and passwords on the black market and just wrote a bot that went through Ring and used every one of them, with maybe a 1% hit rate.

[1917] But that's a big hit given the number of Ring users.
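The lockout-after-10-tries control mentioned above, the one reportedly shipped but off by default, is straightforward to implement. A sketch with illustrative thresholds; the `now` parameter exists so the logic is deterministic and testable:

```python
import time
from collections import defaultdict

class LoginThrottle:
    """Lock an account after too many recent failed logins, which is the
    basic defense against credential-stuffing bots. Thresholds are
    illustrative, not anyone's real configuration."""

    def __init__(self, max_failures=10, lockout_seconds=900):
        self.max_failures = max_failures
        self.lockout_seconds = lockout_seconds
        # username -> list of failure timestamps (epoch seconds)
        self.failures = defaultdict(list)

    def is_locked(self, username, now=None):
        now = time.time() if now is None else now
        # Keep only failures inside the lockout window.
        recent = [t for t in self.failures[username]
                  if now - t < self.lockout_seconds]
        self.failures[username] = recent
        return len(recent) >= self.max_failures

    def record_failure(self, username, now=None):
        self.failures[username].append(time.time() if now is None else now)
```

A bot stuffing thousands of leaked credentials trips this after ten attempts per account, which is exactly why shipping it disabled for convenience made the stuffing attack cheap.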

[1918] You know, you can also use password managers to make the changing of passwords easier.

[1919] And you can choose the difficulty, the number of special characters, the length of it, and all that.

[1920] My favorite thing is when websites yell at you for your password being too long or having too many special characters, or, like, yeah, you're not allowed to have this special character or something.

[1921] You can only use these three special characters.

[1922] You know, do you understand how password cracking works?

[1923] if you specifically tell me what special characters I can use?
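For contrast, this is roughly how a password manager generates a password: uniform random draws from a large alphabet using a cryptographically secure generator. Shrinking the allowed character set, as the sites complained about here do, directly shrinks the attacker's search space. A minimal stdlib sketch:

```python
import secrets
import string

def make_password(length=20, alphabet=None):
    """Generate a password via uniform CSPRNG draws from a large alphabet.

    With the full 94-character printable set, each added character
    multiplies the search space by 94; banning special characters
    cuts that multiplier, which is why charset restrictions hurt.
    """
    alphabet = alphabet or (string.ascii_letters
                            + string.digits
                            + string.punctuation)
    # secrets.choice is the CSPRNG-backed pick; random.choice would not do.
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

A 20-character draw from the full set is far beyond any cracking rig, and since every password is unique per site, a breach of one site leaks nothing reusable.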

[1924] I honestly just want to have a one -on -one meeting, like late at night with the engineer that program that, because that's like an intern.

[1925] I just want to have a sit -down meeting.

[1926] Yeah, I made my parents switch banks once because the security was so poor.

[1927] I was like, you just can't have money here.

[1928] But then there's also like the zero -day attacks.

[1929] I mentioned before the QNAP NAS that got hacked.

[1930] Luckily, I didn't have anything.

[1931] private on there, but it really woke me up to, like, okay, you have to take everything extremely seriously.

[1932] Unfortunately, for the end users, there's nothing you can do about zero day.

[1933] It's, you know, you have no control over that.

[1934] I mean, it's a, the engineers that made the software don't even know about it.

[1935] Now, let's talk about one-days.

[1936] So there's a patch now out there for the security hole.

[1937] So if you're not updating your systems with these security patches, that's just on you. My father-in-law has such an old iPhone you can't security patch it anymore. And I tell him, you know, this is what you're missing out on, this is what you're exposing yourself to. Because, you know, we talked about that powerful tool, how we found rossulbricht@gmail.com. Well, bad guys are using that too. You know, it used to be called Google dorking, now I think it's called Google hacking by the community. You can go in, you know, and find a vulnerability, read the white paper about what's wrong with that software, and then you can go on the internet and find all of the computers that are running that outdated software.

[1938] And there's your list, there's your target list.

[1939] Yeah.

[1940] I know the vulnerabilities that are running.

[1941] Again, not making a playbook here.

[1942] But, you know, that's how easy it is to find your targets, and that's what the bad guys are doing.

[1943] Then the reverse is tough.

[1944] It's much tougher, but it's still doable, which is like first find the target.

[1945] If you have specific targets to, you know, hack into a Twitter account, for example.

[1946] Much harder.

[1947] That's probably social engineering, right?

[1948] That's probably the best way.

[1949] Probably if you want something specific to that, I mean, if you really want to go far, you know, if you're targeting a specific person, you know, how hard is it to get into their office and put a, you know, a little device, USB device in line with their mouse, who checks how their mouse is plugged in?

[1950] And you can, for 40 bucks on the black market, you can buy a key logger that just USB, then the mouse plugs right into it.

[1951] It looks like an extension on the mouse if you can even find it.

[1952] You can buy the stuff with a mouse inside of it and just plug it into somebody's computer.

[1953] And there's a key logger that lives in there and calls home, sends everything you want.

[1954] So, I mean, and it's cheap.

[1955] Yeah.

[1956] In grad school, I built a bunch of key loggers.

[1957] It was fascinating, tracking the mouse, just for what I was doing as part of the research.

[1958] It was to see if, by the dynamics of how you type and how you move the mouse, you can tell who the person is.

[1959] Oh, wow.

[1960] That's like, it's called the active authentication.

[1961] Basically biometrics that's not using biology, to see how identifiable that is.

[1962] So it's fascinating to study that, but it's also fascinating how damn easy it is to install key loggers.
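
The active-authentication idea mentioned here, telling people apart by typing rhythm, can be illustrated with a toy feature extractor. This is a minimal sketch, not the actual research code: the event format, sample data, and tolerance are all invented for illustration.

```python
# Toy keystroke-dynamics sketch: compare the timing "rhythm" of two
# typing samples. Event format and threshold are invented.

def _mean(xs):
    return sum(xs) / len(xs) if xs else 0.0

def features(events):
    """events: list of (key, press_time, release_time) in seconds.
    Returns (mean dwell time, mean flight time)."""
    dwells = [up - down for _, down, up in events]
    # flight time: gap between releasing one key and pressing the next
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return _mean(dwells), _mean(flights)

def same_typist(sample_a, sample_b, tolerance=0.05):
    """Crude check: both feature values within `tolerance` seconds."""
    fa, fb = features(sample_a), features(sample_b)
    return all(abs(x - y) <= tolerance for x, y in zip(fa, fb))

alice1 = [("h", 0.00, 0.08), ("i", 0.20, 0.29)]
alice2 = [("h", 0.00, 0.09), ("i", 0.21, 0.30)]
bob =    [("h", 0.00, 0.25), ("i", 0.60, 0.83)]
print(same_typist(alice1, alice2))  # True: similar rhythm
print(same_typist(alice1, bob))     # False: slower, heavier typist
```

Real systems use many more features and a trained classifier per user, but the principle is the same: the timing signature is hard to fake even when the typed text is identical.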

[1963] So I think what naturally happens is you realize how many vulnerabilities there are in this world.

[1964] You do that when you understand bacteria and viruses.

[1965] You realize they're everywhere.

[1966] And it's the same way with these, I'm talking about the biological ones there.

[1967] And then you realize all the vulnerabilities that are out there.

[1968] One of the things I've noticed quite a lot is how many people don't log out of their computers.

[1969] Just how easy physical access to these systems actually is.

[1970] Like, in a lot of places in this world, and I'm not talking about private homes, I'm talking about companies, especially large companies, it seems quite trivial in certain places that I've been to to walk in and have physical access to the systems. And that's depressing to me. I laugh because one of my partners at Naxo, where I work now, he worked at a big company, like, you would know the name as soon as I told you, I'm not going to say it. But the guy who owned the company, and the company has his name on it, didn't ever want to log into a computer. It just annoyed the shit out of him.

[1971] So they hired a person that stands next to his computer when he's not there.

[1972] And that's his physical security.

[1973] That's good.

[1974] That's pretty good, actually.

[1975] Yeah, I mean, I guess if you can afford to do that.

[1976] At least you're taking your security seriously.

[1977] I feel like a lot of people in that case would just not have a login.

[1978] Yeah.

[1979] No, the security team there had to really work around it to make that work. It's not compliant with company policy.

[1980] But that's interesting. The key loggers, there's just a lot of threats. Yeah, I mean, a lot of ways to get in. Yeah, I mean, you can't sit around and worry about someone physically gaining access to your computer with a key logger and stuff like that. You know, if you're traveling to a foreign country and you work for the FBI, then yeah, you do. For some countries, sometimes you would bring a fake laptop just to see if they stole it or accessed it. I really want, especially in this modern day, to just create a lot of clones of myself that generate Lex-sounding things and just put so much information out there, actually dox myself all across the world, and then you're not a target, I guess, just put it out there. I've always said that, though. We do these searches at FBI houses and stuff like that. If someone just got a box load of, like, 10-terabyte drives and just encrypted them, oh my God, do you know how long the FBI would spin their wheels trying to get that data off? It'd be insane. Oh, so just give them that.

[1981] You don't even know which one you're looking for.

[1982] Yeah.

[1983] That's true.

[1984] That's true.

[1985] So it's like me printing like a treasure map to a random location, just get people to go on goose chases.

[1986] Yeah.

[1987] What about operating system?

[1988] What have you found?

[1989] What's the most secure and what's the least secure operating system?

[1990] Windows, Linux.

[1991] Is there no universal?

[1992] There's no universal security.

[1993] I mean, it changed.

[1994] People used to think Macs were the most secure just because they just weren't out there.

[1995] But now kids have had access to them.

[1996] So, you know, I know you're a Linux guy.

[1997] I like Linux too.

[1998] But, you know, it's tough to run a business on Linux.

[1999] You know, people want to move more towards the Microsofts and the Googles just because it's easier to communicate with other people that maybe aren't computer guys.

[2000] So you have to just take what's best, what's easiest, and secure the shit out of it as much as you can and just think about it.

[2001] What are you doing these days in Naxo?

[2002] So we just started Naxo.

[2003] So I left the government and went to a couple of consultancies.

[2004] And really, all the people I worked well with in the government.

[2005] I brought them out with me. And now you used to work for the man and now you're the man. Exactly.

[2006] But now we formed a partnership.

[2007] And it's just a new cybersecurity firm.

[2008] Our launch party is actually on Thursday.

[2009] So it's going to be exciting.

[2010] Do you want to give more details about the parties?

[2011] that somebody can hack into it?

[2012] No, I won't even tell you where it is.

[2013] You can come if you want, but don't bring the hackers.

[2014] Hector will be there with us.

[2015] I can't believe you invited me because you also say insider threat is the biggest threat.

[2016] By the way, can you explain what the insider threat is?

[2017] The biggest insider threat in my life is my children.

[2018] My son's big into Minecraft and he'll download executables mindlessly and just run them on the network.

[2019] So he is...

[2020] Do you recommend against marriage and family and kids?

[2021] Nope, nope.

[2022] From a security perspective.

[2023] From a security perspective, absolutely.

[2024] But no, I just do segmentation.

[2025] I mean, we've done it in businesses for years, segmenting networks, different networks.

[2026] I just do it at home.

[2027] My kid's on his own network.

[2028] It makes it a little bit easier to see what they're doing, too.

[2029] You can monitor traffic and then also throttle bandwidth.

[2030] If your Netflix isn't playing fast enough or buffers or something.

[2031] So you can obviously change that a little too.
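
The per-device throttling described here is usually configured through router QoS rules, and conceptually many of those rules implement something like a token bucket. A toy sketch, with invented rates and packet sizes:

```python
# Toy token-bucket rate limiter, the mechanism many routers use for
# per-device bandwidth throttling. Rates and sizes are invented.

class TokenBucket:
    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s   # tokens refilled per second
        self.capacity = burst_bytes    # maximum short-term burst
        self.tokens = burst_bytes      # start with a full bucket
        self.last = 0.0

    def allow(self, packet_bytes, now):
        """Refill tokens for the elapsed time, then try to spend them."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True
        return False

# Throttle a device to 1000 B/s with a 1500 B burst allowance.
bucket = TokenBucket(rate_bytes_per_s=1000, burst_bytes=1500)
print(bucket.allow(1500, now=0.0))  # True: full burst fits
print(bucket.allow(1500, now=0.5))  # False: only ~500 tokens refilled
print(bucket.allow(1500, now=2.0))  # True: bucket refilled to capacity
```

A real router applies this per queue or per MAC address; the point is just that a device can burst briefly but is capped at the refill rate on average.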

[2032] You know they're going to listen to this, right?

[2033] You're going to get your tricks.

[2034] Yeah, that's true.

[2035] They definitely will listen.

[2036] But there's nothing more humbling than your family.

[2037] You think you've done something big and you go on a big podcast and talk to Lex Fridman.

[2038] They don't, they don't fucking care.

[2039] Unless you're on TikTok or you'll show up on a YouTube feed or something like that.

[2040] And I'll be like, oh, yeah.

[2041] Whatever, this guy's boring.

[2042] My son does a podcast for his school.

[2043] And I still can't get invited on it.

[2044] So, Hector and I just started a podcast talking about cybersecurity. We do a podcast called Hacker and the Fed. It just came out yesterday.

[2045] So, first episode.

[2046] Nice.

[2047] So, yeah, we got 13,300 downloads the first day.

[2048] So, pretty good. We were at the top of Hacker News, which is a big website in our world.

[2049] So it's called Hacker and the Fed?

[2050] Hacker and the Fed is the name of it.

[2051] Go download and listen to Hacker and the Fed. I can't wait to see, because I don't think I've seen a video on YouTube of you two together, so I can't wait to see what the chemistry is like.

[2052] It's not weird that you guys used to be on opposite sides,

[2053] and now you're friends?

[2054] So, yeah, I mean, we just did some, a trailer and all that, and our producer, we have a great producer guy named Phineas, and he kind of pulls things out of me, and I said, I said, okay, I got one.

[2055] My relationship with Hector, you know, we're very close friends now, and I was like, oh, I arrested one of my closest friends, which is a very strange relationship.

[2056] Yeah, it's weird.

[2057] You know, but he says that I changed his life.

[2058] I mean, he was going down a very dark path, and I gave him an option that one night.

[2059] and he made the right choice.

[2060] I mean, he's, he now does penetration testing.

[2061] He does a lot of good work, and, you know, he's turned his life around.

[2062] Do you worry about cyber war in the 21st century?

[2063] Absolutely.

[2064] If there is a global war, it'll start with cyber, you know, if it's not already started.

[2065] Do you feel like there's a, like a boiling, like the drums of war are beating?

[2066] What's happening in Ukraine with Russia?

[2067] It feels like the United States is becoming more and more involved in the conflict in that part of the world, and China, watching very closely, is starting to get involved geopolitically and probably in terms of cyber.

[2068] Do you worry about this kind of thing happening in the next decade or two, like where it really escalates?

[2069] You know, people in the 1920s were completely terrible at predicting World War II.

[2070] Do you think we're at the precipice of war potentially?

[2071] I think we could be.

[2072] I mean, I would hate to just be, you know, just fear-mongering out there.

[2073] You know, COVID's over, so the next big thing in the media is war and all that.

[2074] But, I mean, there's some flags going up that are very strange to me. Are there ways to avoid this?

[2075] I hope so.

[2076] I hope smarter people than I are figuring it out.

[2077] I hope people are playing their parts in talking to the right people because the war is the last thing I want.

[2078] Well, there's two things to be concerned about on the cyber side.

[2079] One is the actual defense on the technical side of cyber, and the other one is the panic that might happen when some dramatic event happens because of cyber, some major hack that becomes public.

[2080] I'm honestly more concerned about the panic because I feel like if people don't think about the stuff, the panic can hit harder.

[2081] Like if they're not conscious about the fact that we're constantly under attack, I feel like it'll come like a much harder surprise.

[2082] Yeah, I think people will be really shocked on things.

[2083] I mean, so we talked about LulzSec today, and LulzSec was 2011.

[2084] They had access into the water supply system of a major U.S. city.

[2085] They didn't do anything with it.

[2086] They were sitting on it in case someone got arrested and they were going to maybe just expose that it's insecure.

[2087] Maybe they were going to do something to fuck with it.

[2088] I don't know.

[2089] But, you know, that's 2011.

[2090] You know, I don't think it's gotten a lot better since then.

[2091] And there's probably nation states or major organizations that are sitting secretly on hacks like this.

[2092] 100%.

[2093] 100%.

[2094] They are sitting there, waiting. I mean, again, I don't want to scare the shit out of people, but people have to understand the cyber threat.

[2095] I mean, there are, you know, there are thousands of nation state hackers in some countries.

[2096] I mean, we have them too.

[2097] We have offensive hackers.

[2098] You know, with the terrorist attacks of 9/11, there were planes that actually hit actual buildings, and it was visibly clear, and you could trace the information.

[2099] With cyber attacks, say something that would result in a major explosion

[2100] in New York City, how the hell do you trace that?

[2101] Like, if it's well done, it's going to be extremely difficult.

[2102] The problem is, there's so many problems.

[2103] One of which is that the U.S. government, in that case, has complete freedom to blame anybody they want.

[2104] True.

[2105] And then to start a war with anybody. That's, sorry, that's one cynical take on

[2106] it, of course.

[2107] No, but you're going down the right path.

[2108] I mean, the guys that flew planes into the buildings wanted attribution.

[2109] They took credit for it.

[2110] When we see the cyber attack, I doubt we're going to see attribution.

[2111] Maybe the victim side, the U.S. government on this side, might come out and try to blame somebody.

[2112] But, you know, like you've brought up, they could blame anybody they want.

[2113] There's not really a good way of verifying that.

[2114] Can I just ask for your advice?

[2115] So in my personal case, am I being tracked?

[2116] How do I know?

[2117] how do I protect myself?

[2118] Should I care?

[2119] You are being tracked.

[2120] I wouldn't say you're being tracked by the government.

[2121] You're definitely being tracked by big tech.

[2122] No, I mean, me personally, Lex, at an escalated level.

[2123] So like, like you mentioned, there's an FBI file on people.

[2124] Sure.

[2125] I'd love to see what's in that file.

[2126] Who did I have the argument with?

[2127] Oh, let me ask you.

[2128] FBI.

[2129] Yeah.

[2130] How's the cafeteria food in FBI?

[2131] At the Academy, it's bad.

[2132] Yeah.

[2133] What about like?

[2134] At headquarters?

[2135] Headquarters.

[2136] A little bit better because that's where the director, I mean, he eats up on the seventh floor.

[2137] Have you been, like, to Google?

[2138] Have you been to Silicon Valley, those cafeterias, like those?

[2139] I've been to the Google in Silicon Valley.

[2140] I've been to the Google in New York.

[2141] Yeah.

[2142] The food is incredible.

[2143] It is great.

[2144] So FBI is worse.

[2145] Well, when you're going through the academy, they don't let you outside of the building.

[2146] So you have to eat it.

[2147] And I think that's the only reason people eat it.

[2148] It's pretty bad.

[2149] I got it.

[2150] But there's also a bar inside the FBI Academy.

[2151] People don't know that.

[2152] Alcohol bar?

[2153] Yes, alcohol bar.

[2154] And as long as you've passed your PT and things are going well, you're allowed to go to the bar.

[2155] Nice.

[2156] It feels like if I was a hacker, I would be going after, like, celebrities, because they're a little bit easier, like Hollywood celebrities.

[2157] Hollywood nudes were a big thing there for a long time.

[2158] But now, yeah, I guess nudes is one thing.

[2159] That's what they went after.

[2160] I mean, all those guys, they social engineered.

[2161] They did, they social engineered Apple to get backups, to get the recoveries for backups.

[2162] And then they just pulled all their nudes.

[2163] And, I mean, whole websites were dedicated to that.

[2164] Yeah.

[2165] See, I wouldn't do that kind of stuff.

[2166] It's very creepy.

[2167] I would go, if I was a hacker, I would go after, like, major, like, powerful people.

[2168] and, like, tweet something from their account and, like, something that, like, positive, like, loving.

[2169] But, like, for the lulz, obvious that it's a troll.

[2170] God, you get busted so quick.

[2171] What a bad hacker.

[2172] Really?

[2173] But why?

[2174] Because hackers never put things out about love.

[2175] Oh, God.

[2176] Oh, you mean, like, this is clearly...

[2177] Yeah, this is clearly Lex.

[2178] What the fuck?

[2179] He talks about love and every podcast he does.

[2180] I would just be like, no. Oh, God damn.

[2181] And now somebody's going to do it.

[2182] You'll blame me. It wasn't me. Looking back at your life, is there something you regret?

[2183] I'm only 44 years old.

[2184] I'm already looking back.

[2185] Is there stuff that you regret?

[2186] Evie Unit.

[2187] He got away.

[2188] So that was the one that got away.

[2189] Yeah, I mean, it took me a while into my law enforcement career to learn about the compassionate side.

[2190] And it took Hector Monsegur to make me realize that criminals aren't really criminals, they're human beings. That really humanized the whole thing for me, sitting with him for nine months.

[2191] I think that's maybe why I had a lot more compassion when I arrested Ross.

[2192] Probably wouldn't have been so compassionate if it was before Hector.

[2193] But yeah, he changed my life and showed me that humanity side of things.

[2194] So would it be fair to say that all the criminals or most criminals are just people that took a wrong turn at some point?

[2195] They all have the capacity for good and for evil in them?

[2196] I'd say 99 % of the criminals that I've interacted with, yes, the people with the child exploitation, no, I don't have any place in my heart for them.

[2197] What advice would you give to people in college, people in high school, trying to figure out what they want to do with their life, how to have a life they can be proud of, how to have a career they can be proud of, all that kind of stuff?

[2198] In the U.S. budget that was just put forward, there's $18 billion for cybersecurity.

[2199] We're about a million people short of where we really should be in the industry, if not more.

[2200] If you want job security and want to work and see exciting stuff, head towards cybersecurity.

[2201] It's a good career.

[2202] And, you know, one thing I dislike about cybersecurity right now is they expect you to come out of college and have 10 years experience in protecting and knowing every different Python script out there and everything available.

[2203] You know, the industry needs to change and let the lower-level people in, in order to broaden and get those million jobs filled.

[2204] But as far as their personal security, just remember it's all going to follow you.

[2205] I mean, you know, there's laws out there now that you have to turn over your social media accounts in order to have certain things.

[2206] They just changed that in New York State.

[2207] If you want to carry a gun, you have to turn over your social media to figure out if you're of good social character.

[2208] So hopefully you didn't say something strange in the last few years, because it's going to follow you forever. I bet Ross Ulbricht would tell you the same thing: don't put rossulbricht@gmail.com on things, because it's going to last forever. Yeah, people sometimes, for some reason, interact on social media as if they're talking to a couple of buddies, like just shooting the shit and mocking and, you know, busting each other's chops, making fun of yourself. Especially gaming culture, like people who stream. Thank God that's not recorded. Oh my God, the things people say on those streams. Yeah, but a lot of them are recorded. Yeah, there's a whole Twitch thing where people stream for many hours a day, and, I mean, just outside of the very offensive things they say, they just swear a lot. They're not the kind of person that I would want to hire.

[2209] Or want to work with.

[2210] Now, I understand that some of us might be that way privately, I guess, when you're shooting shit with friends, like playing a video game and talking shit to each other, maybe.

[2211] But, like, that's all out there.

[2212] You have to be conscious of the fact that that's all out there.

[2213] And it's just not a good look.

[2214] It's complicated, because I'm, like, against hiding who you are.

[2215] If you're an asshole, you should hide some of it. Yeah, but, like, I just feel like it's going to be misinterpreted when you talk shit to your friends while you're playing video games. It doesn't mean you're an asshole, because being an asshole to your friend, that's how a lot of friends show love. Yeah. An outside person can't judge how I'm friends with you. If I want to be, this is our relationship. If that person can say that I'm an asshole to them, then that's fine, I'll take it. But you can't tell me I'm an asshole to them just because you saw my interaction.

[2216] I agree with that.

[2217] They'll take those words out of context, and now that's considered who you are. It's dangerous, and people take that very nonchalantly.

[2218] People treat their behavior on the internet very, very carelessly.

[2219] That's definitely something that you need to learn and take extremely seriously.

[2220] Also, I think that taking that seriously will help you figure out what you really stand for.

[2221] If you use your language carelessly, you never really ask, like, what do I stand for?

[2222] I feel like it's a good opportunity when you're young to ask like what are the things that are okay to say?

[2223] What are the things, what are the ideas I stand behind?

[2224] Like, what are, especially if they're controversial, and I'm willing to say them because I believe in them, versus just saying random shit for the lulz.

[2225] Because the random shit for the lulz, keep that off the internet.

[2226] That said, man, I was an idiot for most of my life and I'm constantly learning and growing.

[2227] I'd hate to be responsible for the kind of person I was in my teens, in my 20s.

[2228] I didn't do anything offensive, but I just changed as a person.

[2229] Like I used to, I guess I probably still do, but I used to, you know, I used to read so much existential literature.

[2230] That was a phase.

[2231] There's like phases.

[2232] Yeah, you grow and evolve as a person that changes you in the future.

[2233] Yeah, thank God there wasn't social media when I was in high school.

[2234] Thank God.

[2235] Oh my God, I would never have gotten in the FBI.

[2236] Would you recommend that people consider a career at a place like the FBI?

[2237] I loved the FBI.

[2238] I never thought I would go anyplace else, but the FBI, I thought I was going to retire with the gold watch and everything from the FBI.

[2239] That was my plan.

[2240] You get a gold watch?

[2241] No, but you know what it is.

[2242] Oh, it's an expression.

[2243] You get a gold badge.

[2244] You actually get your badge in Lucite, and your creds,

[2245] they put in Lucite and all that.

[2246] So does it, by the way, just on a tangent since we like those, does it hurt you that the FBI by certain people is distrusted or even hated?

[2247] 100%.

[2248] It kills me. Until recently, I never... I'm sometimes embarrassed about the FBI now, which is really, really hard for me to say, because I love that place.

[2249] I love the people in it.

[2250] I love the brotherhood that you have with, you know, all the guys in your squad, the guys and girls, I just use guys. You know, I developed a real drinking problem there because we were so social, going out after work and, you know, continuing on.

[2251] It really was a family, you know, so I do miss that.

[2252] But yeah, I mean, if someone can become an FBI agent, I mean, it's pretty fucking cool, man. The day you graduate and walk out of the academy with a gun and a badge and the power to charge someone with a misdemeanor for flying the United States flag at night, that's awesome.

[2253] So there is a part of representing and loving your country, and especially if you're doing cybersecurity.

[2254] So there's a lot of technical savvy in different places in the FBI.

[2255] Yeah, I mean, there's different pieces.

[2256] Sometimes, you know, you'll see an older agent that's done, you know, not cybercrime, come over to cybercrime at the end.

so he can get a job once he gets out.

[2258] But there's also some guys that come in.

[2259] You know, I won't name his name, but there was a guy.

[2260] I mean, I think he was a hacker when he was a kid.

[2261] And now he's been an agent.

[2262] Now he's way up in management.

[2263] Great guy.

[2264] I love this guy.

[2265] And he knows who he is if he's listening.

[2266] You know, that, you know, he had some skills.

[2267] But we also lost a bunch of guys that had some skills, because we had one guy in the squad who had to leave the FBI because his wife became a doctor, got her residency down in Houston, and couldn't move.

[2268] He wasn't allowed to transfer.

[2269] So he decided to keep his family versus the FBI.

[2270] So there's some stringent rules in the FBI that need to be relaxed a little bit.

[2271] Yeah.

[2272] I love hackers turned like leaders.

[2273] Like, one of my quickly-becoming-good-friends is Mudge.

[2274] He was a big hacker in the 90s, and then recently was Twitter's chief security officer,

or CISO, but he had a bunch of different leadership positions, including being my boss at Google. But originally a hacker.

[2276] It's cool to see, like, hackers become, like, leaders.

[2277] I just wonder what would cause him to stop doing it, why he would then take, like, a managerial route at high-tech companies versus...

[2278] I think a lot of those guys, so this is, like, the 90s, they really were about, like, the freedom, there's, like, a philosophy to it.

[2279] And then, I think, the hacking culture evolved over the years.

[2280] And I think when it leaves you behind, you start to realize like, oh, actually what I want to do is I want to help the world.

[2281] And I can do that in legitimate routes and so on.

[2282] But that's the story that, yeah, I would love to talk to one day.

[2283] But I wonder how common that is, too, like young hackers turn good.

[2284] You're saying it like pulls you in.

[2285] If you're not careful, it can really pull you in.

[2286] Yeah.

[2287] Yeah, you know, you're good at it.

[2288] You become powerful.

[2289] You become, you know, everyone's slapping you on the back and say, what a good job and all that, you know, at a very young age.

[2290] Yeah.

[2291] So, yeah, I would love to get into my buddy's mind on why he stopped hacking and moved on.

[2292] Oh, that's going to make a good conversation.

[2293] In his case, maybe it's always about a great woman involved, a family and so on.

[2294] Yeah, that grounds you.

[2295] Because, like, yeah, there is a danger to hacking that, once you're in a relationship, once you have a family, maybe you're not willing to partake in.

[2296] What's your story?

[2297] From childhood, what are some fond memories you have?

[2298] Fond memories?

[2299] Where did you grow up?

[2300] Well, I don't give away that information.

[2301] In the United States, though?

[2302] Yeah, yeah, yeah, in Virginia.

[2303] In Virginia.

[2304] What are some rough moments?

[2305] What are some beautiful moments that you remember?

[2306] I had a very good family growing up.

[2307] The, like, rough moment, and I'll tell you a story that just happened to me two days ago, and it fucked me up, man. It really did. You'll be the first; I've never told this. I tried to tell my wife this two nights ago, and I couldn't get it out. So my father, he's a disabled veteran, or he was a disabled veteran. He was in the Army and got hurt, and he was in a wheelchair his whole life, for all my growing up. He was my biggest fan. He just wanted to know everything about, you know, what was going on in the FBI, my stories.

[2308] I was a local cop before the FBI, and I got into high-speed car chases, you know, foot chases and all that, and kicking doors in.

[2309] He wanted to hear another of those stories.

[2310] And at some points, I was kind of too cool for school.

[2311] And, ah, dad, I just want a break and all that.

[2312] And things going on.

[2313] We lost my dad during COVID, not because of COVID, but it was around that time.

[2314] But it was right when COVID was kicking off.

[2315] And so he died in a hospital by himself.

[2316] And I didn't get to see him then.

[2317] And then my mom had some people visiting her the other night, Tom and Karen Rogaburg, and I'll say they're my second biggest fans, right behind my dad. They're always asking about me and my career, and they've read the books and seen the movie. They'll even tell you that Silk Road movie was good.

[2318] They'll lie to you on that.

[2319] And so they came over and I helped them with something, and my mom called me back a couple days later and she said, I appreciate you helping them.

[2320] I know, you know, fixing someone's Apple phone over the phone really isn't what you do for a living.

[2321] It's not, it's kind of beneath you and all that.

[2322] But I appreciate it.

[2323] And she said, oh, they loved hearing the stories about, you know, Silk Road and all those things.

[2324] And she goes, you know, your dad, he loved those stories.

[2325] He just, I just wish you could have heard him.

[2326] He even would tell me, he would say, you know, maybe, maybe Chris will come home and I'll get him drunk, and he'll tell me the stories.

[2327] But then she goes, maybe one day in heaven, you can tell him those stories.

[2328] And I fucking lost it.

[2329] I literally stood in my shower, sobbing.

[2330] Yeah.

[2331] Like a child.

[2332] Like, just thinking about, like, all my dad wanted was those stories.

[2333] Yeah.

[2334] And now I'm on a fucking podcast telling stories to the world.

[2335] And I did tell him.

[2336] Yeah.

[2337] So did you ever have, like, a long heart-to-heart with him about, like, such stories? He was in the hospital one time, and I went through, and I wanted to know about his history, like his life, what he did. And I think he maybe sensationalized some of it, but that's what you want. Your dad's a hero, so you want to hear those things. He was a good storyteller. Again, I don't know what was true and not true, but, you know, some of it was really good. And it was just good to hear about his life. But, you know, we lost him, and now those stories are gone.

[2338] You miss him?

[2339] Yeah.

[2340] What did he teach you about what it means to be a man?

[2341] So my dad, um, he was an engineer.

[2342] And so part of his job, he worked for, um, Vermont Power and Electric or whatever it was.

[2343] I mean, when he first got married to my mom and all that, like, he flew around in a helicopter, checking out, like, power lines and dams.

[2344] He used to scuba dive inside dams, to check to make sure they were functioning properly and all that.

[2345] Pretty cool shit.

[2346] And then he couldn't walk anymore.

[2347] I probably would have killed myself if my life switched like that so bad.

[2348] And my dad probably went through some dark points, but he hid that from me, maybe.

[2349] And so getting through that struggle taught me, like, you know, you press on, you have a family, people count on you, you do what you got to do.

[2350] That was big.

[2351] Yeah.

[2352] I'm sure you're making him proud, man. I'm sure I do, but I don't think he knew that, that I knew that.

[2353] Well, you get to pass on that love to your kids now.

[2354] I try, I try, but I can't impress them as much as my dad impressed me. I can try all I want, but.

[2355] Well, what do you think is the role of love?

[2356] Because you gave me some grief.

[2357] You busted my balls a little bit for talking about love.

[2358] a lot.

[2359] What do you think is the role of love in the human condition?

[2360] I think it's the greatest thing.

[2361] I think everyone should be searching for it.

[2362] If you don't have it, find it, get it as soon as you can.

[2363] I love my wife.

[2364] I really do.

[2365] I had no idea what love was until my kids were born.

[2366] My son came out and, this is a funny story.

[2367] He came out and, you know, I just wanted to be safe and be healthy and all that.

[2368] And I said to the doctor, I said, 10 and 10, doc, you know, 10 fingers, 10 toes, everything good.

[2369] And he goes, 9 and 9.

[2370] I was like, what the fuck? I'm like, oh, this is going to suck. Okay, we'll deal with it and all that. He was talking about the Apgar score, some score about breathing and color and all that, and I was like, oh shit, but no one told me this. So I'm just sobbing, I couldn't even cut the umbilical cord. I just fell in love with my kids when I saw them, and that, to me, really is what love is, like, just for them, man. And I see that through your career, that love developed, which is awesome, the being able to see the humanity in people.

[2371] I didn't when I was young, the foolishness of youth.

[2372] Yeah.

[2373] You know, I needed to learn that lesson hard.

[2374] I mean, you know, when I was young in my career, it was just about career goals, and, you know, arresting people became stats, you know.

[2375] You arrest someone, you get a good stat, you get an attaboy, you know, maybe the boss likes it and you get a better job, or you move up the chain.

[2376] It took a real change in my life to see that humanity.

[2377] And I can't wait to listen to the two of you.

[2378] Your talk is probably hilarious and insightful, given the lives the two of you have lived and given how much you've changed each other's lives.

[2379] I can't wait to listen, brother.

[2380] Thank you so much.

[2381] This is a huge honor. You're an amazing person with an amazing life.

[2382] This is an awesome conversation.

[2383] Dude, huge fan.

[2384] I love the podcast.

[2385] Glad I could be here.

[2386] Thanks for the invite.

[2387] Some exercise for the brain, too.

[2388] It was great.

[2389] It was a great conversation.

[2390] And the heart too, right?

[2391] Oh, yeah, yeah.

[2392] You got some tears there at the end.

[2393] Thanks for listening to this conversation with Chris Tarbell.

[2394] To support this podcast, please check out our sponsors in the description.

[2395] And now, let me leave you with some words from Benjamin Franklin.

[2396] They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.

[2397] Thank you for listening, and hope to see you next time.