#1951 - Coffeezilla

The Joe Rogan Experience


Full Transcription:

[0] Joe Rogan podcast, check it out.

[1] The Joe Rogan Experience.

[2] Train by day, Joe Rogan podcast by night, all day.

[3] Nice to meet you, man. Hey, thanks a lot.

[4] I appreciate what you do.

[5] What you do is a very valuable service.

[6] Because you go so deep on some of these scammers.

[7] It's like it's so important.

[8] Because there's so many people that just, they don't really understand what's going on.

[9] Like the FTX thing, for example, the best one.

[10] Because I was so in the dark about this thing.

[11] I was like, what is happening?

[12] Like, what are they doing?

[13] Try to break it down for us.

[14] Like, first of all, what is a, it's a crypto exchange, right?

[15] So how does that work?

[16] So the first question is when you learn about crypto, you're like, it's this magic internet money.

[17] Magic.

[18] How do you get some of that?

[19] How do you get some of that magic internet money?

[20] Well, you have to go somewhere to buy it.

[21] And so a crypto exchange is where you kind of go.

[22] You put your fiat on, your dollars or whatever, euros or whatever, and you put it into this crypto exchange.

[23] They have a bank, and they work with that bank.

[24] Then they exchange that money for some type of crypto token.

[25] There's a lot of different tokens out there.

[26] Explain tokens, because I don't understand tokens.

[27] I know there's crypto and there's tokens.

[28] Like, what is the difference between the two of them?

[29] Yeah, a token is, like, the individual.

[30] You can think of currency, right?

[31] So it's like the individual.

[32] So Bitcoin is, you have Bitcoin.

[33] Then you have, it's one of the cryptocurrencies.

[34] You have Ethereum.

[35] You have Dogecoin.

[36] You have Safe Moon.

[37] You have FTT, which is what FTX was using as their native token.

[38] So a lot of these guys, you'll start a crypto exchange and then you'll launch your own token that people can invest in, sort of like they're investing almost in your crypto exchange.

[39] And so that was actually one of the ways that FTX really perpetuated their fraud.

[40] I can break it down.

[41] How much do you know about the FTX situation?

[42] Let's break it down for people that don't know about it.

[43] Let's do it.

[44] So, FTX was this crypto exchange located out in the Bahamas, which is a great place to put your...

[45] Why do they do it in the Bahamas?

[46] Because it's unregulated.

[47] So the problem with doing stuff in the United States or, you know, somewhere like Europe or something like that is you're subject to all these regulations which require you to be a little more careful.

[48] Oh, those are pesky.

[49] Yeah, they're annoying.

[50] We don't need that.

[51] The famous example is like Coinbase is in America.

[52] and they have to file all these forms, they have to be, they're a regulated entity, they're a publicly traded company.

[53] So they have to report everything.

[54] So if you're offshore, you can kind of not do any of that.

[55] You can play fast and loose.

[56] And, you know, for some people, they think that's better.

[57] They can offer, let's say, like, 100x leverage.

[58] Like, you have a dollar.

[59] I'll let you trade with $100.

[60] And that's, that's going to be, like, one reason you come to my offshore exchange.

[61] I can offer you more leverage than the guys who are, like, you know, Coinbase or something like that.
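The leverage pitch he's describing can be put in toy numbers. This is a hedged sketch with hypothetical figures, not any real exchange's margin engine: at 100x, a $1 deposit controls a $100 position, so a 1% move against you wipes out the whole deposit.

```python
# Hypothetical illustration of 100x leverage; figures are made up.

def position_value(deposit: float, leverage: float) -> float:
    """Notional position size a deposit controls at a given leverage."""
    return deposit * leverage

def pnl(deposit: float, leverage: float, price_change_pct: float) -> float:
    """Gain or loss on the deposit for a given percent move in the asset."""
    return position_value(deposit, leverage) * price_change_pct / 100

print(position_value(1.0, 100))  # $1 deposit trades like $100
print(pnl(1.0, 100, -1.0))       # a 1% drop loses $1.0: the entire deposit
```

That asymmetry is exactly why high leverage was a draw for offshore exchanges and a fast way for customers to get wiped out.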

[62] Right.

[63] So FTX launches.

[64] Let's start with who Sam Bankman-Fried is.

[65] He's kind of at the center of all of this.

[66] Sam Bankman-Fried is this guy who comes, he's the son of two Harvard lawyers.

[67] Then he comes up, prep school.

[68] He's kind of like built for success, right?

[69] He goes to MIT, goes to Jane Street as this quantitative trader.

[70] And then he goes into the crypto space and he launches FTX.

[71] He's very young, right?

[72] How old is he?

[73] I think he was.

[74] He is young.

[75] I'm not sure, maybe.

[76] Maybe

[77] you can look that up, Jamie.

[78] 31.

[79] He launches Alameda Research first, which is just like this trading firm, where basically the idea is, we have some ideas.

[80] We're going to raise a little bit of money, and we're going to do these trades that are profitable in crypto.

[81] So the way he first made his money was he did something where he bought Bitcoin in the U.S. and then he sold it on these Japanese exchanges where it was worth more.

[82] So he was arbitraging

[83] this difference in prices.
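The arbitrage he describes, buying Bitcoin in the U.S. and selling it on Japanese exchanges at a premium, is a simple price-gap calculation. A minimal sketch with made-up prices and fees (the hard part in practice was moving money between banks and exchanges, which this ignores):

```python
# Toy cross-exchange arbitrage. All prices and fees are hypothetical;
# the actual Japan/U.S. Bitcoin premium varied over time.

def arbitrage_profit(buy_price: float, sell_price: float,
                     amount: float, fee_pct: float = 0.0) -> float:
    """Profit from buying on the cheap venue and selling on the expensive one."""
    cost = buy_price * amount * (1 + fee_pct / 100)
    proceeds = sell_price * amount * (1 - fee_pct / 100)
    return proceeds - cost

# A hypothetical 10% premium on 5 BTC at a $10,000 U.S. price:
print(arbitrage_profit(10_000, 11_000, 5))       # 5000.0 before fees
print(arbitrage_profit(10_000, 11_000, 5, 1.0))  # less after a 1% fee each way
```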

[84] And then after he made his money that way, he launches FTX in 2019.

[85] And that's a crypto platform where, honestly, you can make a lot more money than just with a trading firm.

[86] So FTX quickly skyrockets in popularity.

[87] They bring on people like Tom Brady to promote it.

[88] Larry David in the Super Bowl.

[89] They kind of get buy-in from all these big sort of names and also reputable people like BlackRock, Sequoia Capital.

[90] They all invest in this guy.

[91] Kevin O'Leary famously promoted it for like $18 million.

[92] They gave him $18 million to promote it?

[93] He says he lost it on the platform.

[94] He says the $18 million was on FTX or whatever, and he never got a dollar out of it.

[95] But that was what the deal was for.

[96] So they were paying everybody to promote this FTX crypto exchange.

[97] And the idea was, this is the next big thing, right?

[98] and this is where you're going to make money.

[99] There was a lot of fear of missing out or FOMO in the markets at the time.

[100] You know, everyone thought, oh, crypto's, you have to get in now, right?

[101] Because if you get in now, you're going to make some money.

[102] And so people invested in FTX thinking that this is going to be a safe platform.

[103] This kid is smarter than everyone else.

[104] He's the son of Harvard lawyers.

[105] We just sort of can't lose.

[106] And nobody paid attention to some of the red

[107] flags that were going on until ultimately it was too late.

[108] It turns out he was pilfering FTX customer deposits and was using them in Alameda Research, which was his trading firm, to try to make extra money, and he lost it.

[109] And so this is all because it's unregulated, like if he was doing this, like, Coinbase can't do this.

[110] Is that correct?

[111] Yeah, Coinbase is much more heavily scrutinized.

[112] They actually have to file with the SEC.

[113] They have to say what they have, where they're putting their money.

[114] They're subject to more regulation about like how they take care of customer deposits.

[115] One of the big things with FTX was they told people, hey, you put your money with us, we're not going to touch it.

[116] We're not going to move it.

[117] That's what FTX said in their terms of service.

[118] So one of the really big problems was they actually weren't doing that, but nobody knew because nobody had a look at their books.

[119] Like it was very opaque.

[120] Nobody knew what was going on behind the scenes.

[121] So even though they said, like, we're not going to touch your money.

[122] As soon as you deposited Bitcoin, I mean, I talked to some of the insiders at Alameda, they said, they had this backdoor system to where they could see you, Joe, deposit a Bitcoin on FTX.

[123] They could grab that Bitcoin and start trading with it immediately.

[124] Wow.

[125] Even though they were never supposed to be able to touch your money, obviously.

[126] That was the whole point.

[127] It's like, you deposit with us.

[128] We're not going to do anything with your money.

[129] It's your money.

[130] It's almost like a bank.

[131] Like you deposit with a bank.

[132] Your bank isn't supposed to go ahead and take your money and go start trading with it unless, you know, obviously we have FDIC insurance, stuff like that.

[133] But, like, they didn't have that.

[134] They just take your money, go trade with it, and that's where the disaster started.

[135] I really enjoyed you catching him on Twitter spaces.

[136] I really enjoyed that.

[137] I listened to that whole thing, because before that, you have this guy who's this, you know, whiz kid, who you listen to him talk.

[138] He has an answer for everything.

[139] He's so articulate.

[140] He's so knowledgeable.

[141] Like, I listened to previous interviews before he got busted.

[142] And then when you have him on, there's a lot of...

[143] I wasn't aware.

[144] I'm not sure.

[145] I don't, I'm not aware of that.

[146] I don't know.

[147] There was all this hemming and hawing and a lot of ums and ahs.

[148] And you just kept on him.

[149] It was amazing that he, first of all, it was amazing that he felt like he could do something like that.

[150] Like, why would he publicly communicate?

[151] This is one of the, this is why it's so interesting to me to look at fraud.

[152] Like, this is why it fascinates

[153] me as well. I think it's an important thing to expose, but, like, I'm interested in the characters who perpetuate fraud because they're such interesting psychological case studies.

[154] Sam Bankman-Fried, you could probably write a whole book about the fact that this guy got away with lying so long, perpetuating this image of himself as this generous billionaire, you know, sort of the next Warren Buffett, that when everything goes wrong, he thinks he can reestablish control. Because he's so smart, he is such a good liar, that he's like, I can just lie my way out of it.

[155] So I think that's why he ultimately talked.

[156] His idea was, if I lied my way into it, I can sort of lie my way out of it.

[157] And this is what he did.

[158] So prior to this, I'd interviewed him twice before.

[159] And I had kind of gotten hamstrung with, like, you know, he's just so good at dodging stuff.

[160] Did you interview him before the scandal?

[161] No, not before the scandal.

[162] So it's like as it was going down.

[163] As it was going down, he goes on all these Twitter spaces.

[164] He doesn't want to, he's doing interviews with everybody.

[165] I ask him.

[166] And he doesn't want to talk to me. So, but he's going on these Twitter spaces.

[167] So I keep, I, like, was tracking when he'd go on a Twitter space.

[168] And I would contact the people ahead of time.

[169] I said, hey, at the end when you're, like, ready for this thing to go down.

[170] Because I know as soon as I get on, it's going to end pretty quickly after.

[171] I said, let me on.

[172] Let me ask him some real hard questions.

[173] Because all these guys are like, Sam, you know, we appreciate your transparency.

[174] Kind of kissing up a little bit.

[175] But I was just like, somebody has to ask him some real questions.

[176] So I had two prior little Twitter space interactions with him.

[177] And he kept getting away with it because he blamed all the wrongdoing of FTX on Alameda Research.

[178] And he said, I don't control Alameda Research.

[179] Even though he was the owner, he's no longer the CEO as of 2020.

[180] He hands it to a girl he actually had a relationship with, Caroline Ellison, right?

[181] And she supposedly controlled it.

[182] He said, she did everything.

[183] I don't have access to the books.

[184] Like, I basically knew nothing.

[185] So anytime you'd call him out on an issue, you'd say, where's the money?

[186] He goes, well, it's, I don't know.

[187] It's gone, it's Alameda Research, ask Alameda. So by the third interview, I'd studied him, and I said, okay, how do we get down to FTX's responsibility in this whole thing? And I kept coming back to: it was the terms of service that said you cannot move... like, when I deposit with you, you're not going to touch my money. And I said, Sam, if that's true, where's the money of all these people? There's no... there's no Ethereum left.

[188] There's no Bitcoin left.

[189] You don't have the real tokens anymore.

[190] You just have your sort of nonsense, FTT tokens, the tokens you invented.

[191] And he said, oh, well, you know, there were some margin trading accounts.

[192] And I'm like, no, but there were people who didn't trade with margin.

[193] There are people who just put their money with you.

[194] And all they wanted was they wanted to store some Bitcoin with Tom Brady.

[195] They wanted to be alongside Tom Brady.

[196] So he's like, well, well, you know, there was fungibility between wallets. And it's like, well, what's fungibility mean? It means whether you were a guy withdrawing who was this degenerate day trader, or you were a grandma who just put one Bitcoin on there, or, you know, more likely the grandson, he treated all the accounts the same. So when everyone came running for the money, they just withdrew until nothing was left. And ultimately, because they had lost billions of dollars, it left billions of dollars in creditor claims, basically.

[197] They didn't have the money.

[198] And so now it's trying to be sorted out by the guy who literally unraveled Enron.

[199] And he says, this lawyer goes, it's worse than Enron.

[200] I watched the CEO, the new CEO, talk about it.

[201] Yeah.

[202] About him trying... that's John Ray.

[203] Yeah, yeah.

[204] Trying to unravel it.

[205] And it's amazing.

[206] It's amazing that things with this amount of money can get this far sideways before anyone knows what's going on.

[207] This is the problem with offshore accounts and stuff.

[208] Like, actually, his whole technique of shifting the blame, like, onto Alameda and, like, I don't control Alameda.

[209] I've seen something very similar.

[210] I'm investigating this Ponzi scheme that's offshore.

[211] And, like, one of the first things the guy does is, he controls it, but he renounced his ownership.

[212] He goes, oh, I'm passing it off to some sham director.

[213] and he goes, I don't have anything to do with it.

[214] I don't know.

[215] Where's the money?

[216] I don't know.

[217] But he controls everything.

[218] And so it's like this is the tactic of these offshore companies is like you put the right people in charge who are going to take the fall, you resign, and then you blame it on them later when everything goes wrong.

[219] His problem, though, is Caroline Ellison flipped on him.

[220] So she definitely flipped on him.

[221] She was smart.

[222] Yeah, yeah.

[223] She cooperated. Her and, I believe, Gary Wang were big executives.

[224] They're cooperating.

[225] They pled guilty.

[226] They're cooperating with the feds.

[227] I mean, they did the smart thing, which is something like this happens.

[228] You shut up.

[229] You don't say anything.

[230] Right.

[231] And then you point at your boss.

[232] I mean, that's basically what they did.

[233] Which they for sure did stuff wrong too.

[234] You did not get to that level and not know that things were wrong.

[235] Well, reading her tweets about amphetamine use was pretty wild too.

[236] The whole scene was wild.

[237] The fact that they were all living together and fucking each other in this giant

[238] penthouse, this $40 million penthouse, the things, it's insane.

[239] It's really, I almost wish it wasn't a scam.

[240] I've said this before because I was kind of, I root for nerds to like be that successful that you're just completely living outside the norms of society, just fucking each other on amphetamines and making billions of dollars.

[241] Like it sounds like a great story if it wasn't illegitimate.

[242] Yeah, that's, I mean, ultimately that's the problem.

[243] Like, Sam was just hopped up on amphetamines playing

[244] League of Legends while on investor calls.

[245] Like, at the time, that was seen as this charming, like, genius thing.

[246] On calls.

[247] On calls.

[248] You could hear even actually, okay, so this is funny.

[249] They found his League of Legends account.

[250] And during some of the calls after the fraud came out, you could hear him clicking in the background.

[251] He's playing league.

[252] Like, they tracked his account.

[253] He's playing league while on calls about the failure of FTX.

[254] Just imagine the arrogance.

[255] Is that arrogance or is it just

[256] pure addiction?

[257] I think, you know, those multiplayer games, those, you know, online role-playing games, that's what that is, right?

[258] It's a... I think it's a MOBA.

[259] I used to play, I used to play like a variant of league.

[260] So I know, I know it's fun, but, but that's not the point.

[261] Look, this guy's whole thing was like, I'm this effective altruist.

[262] I'm this guy who's going to maximize good in the world, you know, and that was his reason for working all the time.

[263] And that's the justification for being hopped up on amphetamines.

[264] It's like maximize productivity, maximize human happiness.

[265] You can't do that and then say, oops, I played a little too much of my video games and lost billions of dollars.

[266] Like, this doesn't fly.

[267] Well, he wasn't saying that I played the video games.

[268] And that's why I lost the money.

[269] But after the fact, he's like, he's still playing video games.

[270] And you're like, can you have the decency to get off the video game and talk to people?

[271] So bizarre.

[272] It's just so bizarre that so many people got duped.

[273] And I felt the same way about Bernie Madoff.

[274] You know, I'm not a financially aware person.

[275] I'm not into the market.

[276] I don't follow these things.

[277] So when I see something like that go down, I'm like, how did he get Steven Spielberg?

[278] You know, how does someone like a Bernie Madoff or Sam Bankman -Fried, how does he get these people to do this?

[279] And in the FTX case, how much of it was getting celebrities to endorse the platform?

[280] It's huge.

[281] This is what I wanted to say.

[282] like the more I study this stuff, and you start to have repeat occurrences, like I just cover stuff all the time, and you see echoes of the same thing.

[283] I just had somebody, just a couple days ago, I was interviewing for this new scheme we're looking at.

[284] And he said, you know, I never understood how Bernie Madoff got people because it seems so preposterous.

[285] And then I fell for something very similar.

[286] And what I noticed with all of these things, the thread is: you believe... you know it's kind of too good to be true, but the social proof is overwhelming.

[287] And it overwhelms your kind of, like, alarm bells.

[288] So the social proof is a combination of things.

[289] So first of all, it's like, it's this guy who drives a Toyota.

[290] So you go, like, well, why does he need to scam me if he's driving a Toyota, right?

[291] Then it's like... which Sam Bankman-Fried did.

[292] Then it's like, okay, Tom Brady backs him.

[293] Well, Tom Brady's got to have some guys who are looking into this.

[294] And then it's like, well, BlackRock backed him.

[295] Well, BlackRock definitely has some guys who looked into it.

[296] It's Sequoia Capital.

[297] They said he might be one of the first, like, trillionaires or whatever. Like, he's such a great entrepreneur, they think he's such a genius. Actually, it might have been, uh, one of the A16Z guys, or... uh, I'm blanking on the name right now. But A16Z... what is that? No, no, no, I'm sorry, uh, I'm blanking on it. It's this famous... I'm going to remember it right after I get out of here. It's one of the famous, um, like, uh, investment funds. They invested in a bunch of NFT projects.

[298] Marc Andreessen, I think, is the guy who runs it?

[299] Jamie's looking it up.

[300] I thought I knew it off top of my head and now I don't want to say the wrong one.

[301] A16Z. A16Z. It was one of those: Sequoia, A16Z.

[302] One of them wrote this glowing review of Sam basically saying he's going to be one of the first trillionaires.

[303] So all these guys basically, a lot of these people backed Sam with the highest endorsements.

[304] And so if you're just an average person, you're thinking, well, how much more due diligence can I do than all these other guys? All these other guys buy into him, right? And then they themselves are kind of also looking at each other, being like, well, that guy did it. Like, it's the hottest deal around, right? Uh, Kevin O'Leary's in. So you kind of think you're swimming safely with, like, other savvy investors, and that's what ultimately gets you to buy in. Bernie Madoff is very similar. I mean, he was, uh, you know, really well regarded on Wall Street.

[305] So when people invested with him, they knew the, like, returns were insane, but it wasn't like he was some random fly-by-night guy.

[306] He was well respected in the Wall Street space.

[307] People thought he might take over the SEC after, like, the current person stepped down.

[308] They thought he was going to take it over.

[309] Like he's one of the leaders of the NASDAQ.

[310] I mean, he was one of the go -to guys.

[311] And so you thought, well, I invest with Bernie.

[312] Like, I can't lose.

[313] It's like almost, you know, betting on the house.

[314] Like the house always wins, right?

[315] So when FTX was taking off, it just seemed like everyone who was a someone was backing him.

[316] So then it was okay.

[317] And then I think a lot of these people deferred to their other friends.

[318] They're all saying it's okay.

[319] Let me put money in.

[320] And it's just a huge case study that just because other people fall for something doesn't mean you're safe.

[321] Like you have to do, I hate to say do your own research because that's such an overused, like scammy phrase.

[322] It's, like, a phrase that, you know, is almost used up.

[323] Chemtrails.

[324] Let me say this.

[325] If it's too good to be true, if they're offering above-market returns that you want to believe in, you go, man, I want to believe this is real.

[326] Don't invest.

[327] Like, that's a bad idea.

[328] People were calling bullshit, though.

[329] Just like they were calling... there were a few people that were wary, that were calling bullshit on Bernie Madoff.

[330] And there were a few people that were standing out and saying, none of this makes sense.

[331] Right.

[332] And who were those people? So there were a few people. Um, there was a Matt Levine interview with, uh, Sam Bankman-Fried. He didn't call him a fraud outright, but he's like, hey, it seems like you're in the Ponzi business and business is good. Whoa. And what did he say to that? He's like, well, you know, like, think of it like a box. And, you know, you tell a bunch of investors, you know, hey, if you put money in this box, we can get some money out, we can give you this yield. He starts to explain, like, this thing that sounds exactly like a Ponzi scheme.

[333] And so ultimately, Matt Levine's like, uh, yeah, this doesn't really make a lot of sense.

[334] But again, it stopped short of this is a fraud because, you know, no one knew.

[335] There's a bunch of backing.

[336] So I made a video at the time being like this crypto CEO just describes a Ponzi scheme.

[337] And that video has aged so well because it's like, people are like, oh, it was all true.

[338] But, like, people were outright calling it a fraud, like Marc Cohodes.

[339] He's a famous short seller.

[340] He was calling that a fraud early.

[341] I have a buddy of mine.

[342] He goes by Dirty Bubble Media on Twitter.

[343] He's like one of the A&A Twitter accounts.

[344] He was calling it a fraud.

[345] You know, there were things that were coming out like questions about, you know, okay, they say they have all this money.

[346] Where?

[347] Where on chain is it?

[348] So like the blockchain, everything's publicly, you can see it, right?

[349] It's all at some address.

[350] And so people were asking, like, where's... you say you have all this Bitcoin, where's the Bitcoin? You say you have all this Ethereum, where's the Ethereum? And why is so much of your balance sheet made up of your own tokens? It's a big question. So one of the things that FTX had done, and a lot of companies were doing at the time, but FTX was sort of the worst offender, is... let's say I give you a, like, let's say I give you a loan, Joe. So an unsecured loan would be, I give you a million dollars and I don't ask for anything.

[351] So if you default on that loan, I'm out of a million.

[352] Another way is I ask for, okay, I'll take some equity in the studio if something goes wrong, right?

[353] So I cover my butt if you default on it.

[354] Now, this is what was going on in crypto.

[355] They're called collateralized loans.

[356] But what FTX was doing was they were saying, hey, like, we'll take a million dollars from you, but instead of giving you collateral like dollars or like some asset, we'll give you FTT tokens, which is their own invented coin, and that should have value if anything goes wrong.

[357] And people were accepting that as value.

[358] But the problem is, the exact moment FTX can't pay you back is the exact moment that FTT becomes worthless.

[359] So you think you have all this collateral.

[360] You think you have this backstop, because on the books, it's worth, you know, X dollars, let's say it's like worth $5 a coin.

[361] But what you're not realizing is the real risk: when FTX can't pay you back, they probably can't pay anyone

[362] back.

[363] Everyone loses confidence.

[364] Everyone sells their FTT tokens.

[365] No one wants to buy it.

[366] It's worth nothing.
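The correlated-collateral problem he's walking through can be sketched numerically. All figures below are hypothetical (the $5 book price echoes his example): the loan looks covered while FTT holds its mark, but the token's price and FTX's solvency fail together.

```python
# Sketch of a loan "backed" by the borrower's own token.
# Hypothetical numbers, not FTX's actual positions.

def collateral_coverage(loan_usd: float, units: float, token_price: float) -> float:
    """Fraction of the loan covered by the collateral at a given token price."""
    return units * token_price / loan_usd

loan = 1_000_000     # $1M lent against FTT collateral
ftt_units = 250_000  # tokens posted as collateral

print(collateral_coverage(loan, ftt_units, 5.00))  # 1.25: over-covered on paper
print(collateral_coverage(loan, ftt_units, 0.05))  # near zero once confidence goes
```

The design flaw is that the collateral's value is conditional on exactly the event it's supposed to insure against.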

[367] So how did this all fall apart?

[368] Great question.

[369] So it's really interesting because it was like a battle between FTX and one of their competitors, Binance.

[370] So the owner of Binance is, I think it's Changpeng Zhao.

[371] He goes by CZ on Twitter.

[372] I probably butchered his name.

[373] But he was actually originally sort of an ally of Sam.

[374] So he invested in FTX early on, put a hundred million in, and eventually got paid out like two billion dollars. Well, yeah, some of it was in this FTT token, though. So they have a bunch of FTT, right? And so it's like, November was when all this stuff went down, and a report comes out from CoinDesk where it shows FTX's balance sheet. It shows actually what tokens they have, you know, for one of the first times. Everyone got to look at it all at once in one place, and people notice, like, wait a second, a lot of their assets are just their own tokens. Like, they had a Serum token, which they controlled most of, FTT.

[375] So it looks like, if you just look at their assets, it looks like they're covering their liabilities.

[376] They owe customers $10 billion.

[377] It looks like they have $10 billion.

[378] But like most of this $10 billion is just their own tokens.
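The balance-sheet point above can be put in toy numbers (all hypothetical, not FTX's actual books): assets nominally cover liabilities, but mark the self-issued tokens down and the hole appears.

```python
# Toy solvency check in the spirit of the CoinDesk-report discussion.
# All figures are illustrative assumptions.

def solvent(liabilities: float, hard_assets: float,
            own_tokens: float, token_mark: float) -> bool:
    """True if assets cover liabilities with own tokens marked at token_mark (0..1)."""
    return hard_assets + own_tokens * token_mark >= liabilities

liabilities = 10_000_000_000  # owed to customers
hard_assets = 2_000_000_000   # dollars, Bitcoin, Ethereum
own_tokens = 8_000_000_000    # FTT, Serum, etc. at book price

print(solvent(liabilities, hard_assets, own_tokens, 1.0))  # True on paper
print(solvent(liabilities, hard_assets, own_tokens, 0.1))  # False in a run
```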

[379] So CZ takes this opportunity to kind of spread some, you know, sort of information about that.

[380] He says, hey, we're actually going to sell most

[381] of our FTT that we got from that deal and we're going to sell it.

[382] We don't know if... we don't know what's going on there. And all of a sudden it starts this firestorm, because people are like... there was already all this worry. That summer, there'd been a bunch of companies that collapsed, and people had never thought FTX... it was kind of the first time anyone thought FTX could maybe not have the money.

[383] So CZ says, hey, maybe they don't have the money, I don't know, whatever, I'm just going to sell some. Sell, like, two billion dollars' worth. But he knew what he was doing.

[384] Oh for sure for sure.

[385] He's a shark.

[386] He's a shark.

[387] He knows what he's doing.

[388] What was the conflict between the two of them that led him to do that?

[389] The conflict was, it's a great question.

[390] The conflict was ultimately that Sam was trying to get some regulations passed, and he knew all the crypto people were trying to control regulations to favor their individual business situation.

[391] And so CZ felt like he was being cut out in Washington.

[392] And I think there was like a tweet from Sam saying like, oh, like I'll see you the next time you're in Washington or something like that.

[393] But, like, it was kind of a dig, because he knows CZ can't go to Washington. Like, he'd be afraid of being indicted, or... I don't really understand why he can't go, but he can't go to America. So Sam was meeting with regulators. CZ felt cut out, like he was basically going to get a bad deal with regulators. Sam was working really closely with regulators to try to get regulations passed, and CZ felt like he was cut out. So that stirred up this, like, battle between them. And ultimately, Sam goes, oh, you won, like, our battle.

[394] And people were like, you know, is it a battle when you lose billions of dollars of customer money?

[395] Like, well, how can you view this as a battle?

[396] Like, but he viewed it as like, we're sparring partners.

[397] Like, and you won this round or you won the war.

[398] Because he thinks this is going to go on forever.

[399] Initially...

[400] He thought he was going to be able to figure out a way to pull all the company's assets together, make everybody sound, and

[401] repay everyone and go back to making money again?

[402] I don't think he thought he would repay everyone, but everyone thought like, oh, we'll just enter Chapter 11 bankruptcy, we'll restructure the company, we'll reopen.

[403] We'll just turn all the debt into new FTT tokens and pay everybody out.

[404] Oh, that's what he thought?

[405] He's on amphetamines, right?

[406] So he can't be thinking totally clearly, and he was probably overly confident.

[407] Yeah, it's pretty clear.

[408] He didn't see, like, the full scope of the situation, especially at first.

[410] It seemed like he thought, you know, he was like saying FTXUS was fine.

[411] And then FTXUS went bankrupt.

[412] And he's the one who put it into bankruptcy.

[413] And then he's telling everyone, oh, no, no, the money's actually still there.

[414] I mean, he was constantly giving a conflicting narrative of what was going on.

[415] Now he's still like trying to say he did nothing wrong.

[416] He maintains he's innocent.

[417] And right now, actually, the big like kind of scandal now is they're finding a bunch of campaign finance violations because he was trying to influence politics, U .S. politics.

[418] I mean, it's insane how deep FTX's influence went from the Bahamas reaching into the United States while technically not really being regulated by the United States.

[419] Yeah, they were the number two donor to the Democratic Party.

[420] That's right.

[421] But also to the Republicans.

[422] This is what's wild.

[423] So Sam knew publicly, in our current

[424] American climate.

[425] Like, it's kind of like, okay, it's a little bit chic to be donating to Democrats.

[426] You can do that without too much, you know, negative press.

[427] But if I'm the number three donor to the Republican Party, that's going to be a bad look.

[428] So he decides to donate dark to a Republican... like, to Republicans.

[429] And part of the accusation is he knowingly did this through one of his executives, Ryan Salami or something.

[430] I think that's his last name.

[431] Salami.

[432] I don't know.

[433] I just don't know how to pronounce it.

[434] But he was, like, yeah, the number

[435] three donor to the Republican Party.

[436] But it was all orchestrated through Sam.

[437] Sam wanted to basically influence politics by just donating, donating, donating.

[438] And the idea is you donate to both sides you can never lose, right?

[439] If you have your hands in both pockets.

[440] But publicly, he's just like he's donating to Democrats because he says, oh, I'm like this like, you know, I care about all these issues.

[441] But it's like even more cynical than just buying one party: it's buying both, and lying about it, so that you can get all the good press of, like, caring about all these social issues while also not caring at all. And ultimately, one of the ways, like, even some of the candidates they donated to were through, like, a third employee we didn't even know about. And they were, like, donating through them for, like, all these, um, LGBTQ+ causes. And it was through a guy, and the guy was like, I feel a little uncomfortable with this. And they said, well, we don't have anyone trustworthy at FTX we can donate through who's gay, so, like, can you do this? So someone had to be gay to do it? Like, no, basically they were like, we need someone trustworthy we can trust to do this, so, hey, you're going to be the guy. Like, we're just going to do a few transactions through your name. That just came out in a press release, it's the new charges. He was basically, like, they call them straw donors, because it's like, if I give money to you to give money to a politician on my behalf, you're a straw donor. You're not really a donor. So Alameda was using customer funds to pay off politicians in order to try to get favorable regulation for, I guess, offshore crypto exchanges, right?

[442] And so these campaign finance, these violations, like, what exactly, like, what are the regulations in terms of like what you're allowed to do and donate and how did he violate them?

[443] So I think the big violation was you're not supposed to, like, if you're Alameda research and you're funneling money through a personal investor, that I think is the problem.

[444] The campaign finance laws, actually, I've heard are pretty weak.

[445] I forget the name of the law, but it was passed in like the early 2000s, 2010s maybe, where it actually became very easy to donate dark, where it's like you can donate through super PACs, political action committees, and you can donate as much as you want and you don't have to be, your name has to appear nowhere.

[446] And so that's actually what he said in one of the interviews.

[447] He goes, no one believes me when I said I donated dark because no one believes anyone would be like everyone wants the credit for donating.

[448] No one believes that I just do it on the sly.

[449] And that's ultimately what he was doing.

[450] But it also looks like he was donating through some of his executives.

[451] And I mean, the whole thing was shady all the way down.

[452] So the person's not named in the report who is donating to Democrats.

[453] We know the one of Republican donating to Republicans was Ryan Salami.

[454] But that person eventually said, well, hey, can we restructure all this money that went through me like a loan so that we can, you know, say that I took a loan out.

[455] I was donating so we didn't violate any laws.

[456] They never ended up doing that.

[457] But like it was very clear the internal conversations were they knew they were committing fraud.

[458] They knew they were doing things wrong.

[459] And this idea was, well, no one's going to catch us, right?

[460] Nobody's ultimately going to find out what we're doing here.

[461] How many more of these are out there in the world?

[462] As big as FTX, we don't know.

[463] I mean, there's only a few that are bigger.

[464] Like there's Binance, very opaque company.

[465] We don't exactly know.

[466] And what has happened to Binance since FTX went down?

[467] Because it seems like they received additional scrutiny, right?

[468] Because now people are starting to look at it.

[469] and I saw that their value went down considerably.

[470] Yeah, Elizabeth Warren wrote a letter to them.

[471] They're being looked at much more closely.

[472] I mean, ultimately, all of these things are so opaque in the sense that you can know their assets.

[473] So it's like a big thing recently in crypto.

[474] They'll say, hey, we're going to show you proof of reserves.

[475] What does that mean?

[476] They mean, we'll show you on-chain all our assets.

[477] You can check yourself.

[478] Like, I have a billion dollars in USDC.

[479] Well, that's great.

[480] But it doesn't matter if I have a billion dollars in crypto, Bitcoin, whatever, if I owe two billion dollars.

[481] That's what ultimately matters.

[482] How much do you have on deposits that you owe out?
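The assets-versus-liabilities point he's making can be sketched in a few lines of Python. This is a hypothetical illustration (the numbers echo the billion-versus-two-billion example above, and `is_solvent` is an invented name), not how any real exchange reports its books:

```python
# Toy illustration of why "proof of reserves" alone proves nothing.
# An on-chain snapshot shows assets; solvency also depends on
# liabilities (customer deposits owed), which stay off-chain.

def is_solvent(on_chain_assets: int, customer_liabilities: int) -> bool:
    """An exchange is solvent only if its assets cover what it owes."""
    return on_chain_assets >= customer_liabilities

# Hypothetical exchange: it can "prove" $1B of reserves on-chain...
assets = 1_000_000_000
# ...while owing $2B to depositors, invisible to the public.
liabilities = 2_000_000_000

print(is_solvent(assets, liabilities))  # prints False
```

The one-line comparison is the whole point: without the liabilities term, the reserves number by itself can't answer the solvency question.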

[483] And so with Binance, we don't really know.

[484] The only one we have a little bit more of a look into is Coinbase.

[485] It seems like they're, you know, legitimate.

[486] But so much of the problem with crypto is we don't know how much of this stuff is money laundering.

[487] We don't know how much of this stuff is like outright the proceeds of criminals.

[488] I mean, we know that these criminals do launder their money through a lot of these crypto exchanges, through mixers.

[489] It's just sort of this big mess right now, and we're waiting for regulators to figure it out.

[490] Finally, regulators have stepped on the scene, but, you know, right now it's just this kind of wild, wild west of you're just having to trust these shady offshore entities that they're telling.

[491] the truth.

[492] Binance says they're fine.

[493] They show proof of reserves, but what are their liabilities?

[494] You know, it's hard to know.

[495] Um, so people really just take you at face value and they, they have to trust that like, oh, other people are invested.

[496] So I guess I'll jump into.

[497] And that's why the celebrities are important.

[498] And that's why the connection of black rock is important.

[499] Huge part.

[500] Yes.

[501] Because they're the legitimacy that says, hey, I'm, you know, I too am safe because Tom Brady's got his money there.

[502] So the lure is like how Bitcoin used to be worth very little, and then one time, at the high of Bitcoin, it was like $70,000 or something like that?

[503] $60K plus, yeah, yeah.

[504] So that's the lure.

[505] The lure is you buy in for pennies and one day you're insanely rich.

[506] I'm sure you know about that one guy who lost a hard drive and who's paying people to go through a landfill to try to find his hard drive because there's billions of dollars worth of Bitcoin on that hard drive.

[507] Yeah, people lose their crypto keys all the time.

[508] I mean, it's kind of an interesting idea where you go, I'm going to get in before everyone else.

[509] But a lot of people found out about crypto the same time the mainstream media, everyone else did.

[510] So by the time they're actually investing, it's too late.

[511] I think the like most fair case you could make about crypto is sometimes national currencies aren't a great idea and you want an alternative.

[512] So like look at the Turkish lira, right?

[513] The inflation rate, I think, is like 75% or something like that.

[514] Like, it's unimaginable.

[515] It's just out of control inflation.

[516] And if you hold on to your Turkish lira, you're in for a bad time because every day it's getting less valuable.

[517] So the question is, what do you do if you're in that country making money?

[518] If you want to store your money somewhere else, how do you store it?

[519] So there's this idea of, like, these alternative currencies

[520] that are kind of interesting.

[521] And then there's some arguments that like, hey, if you're someone like me, I have two employees and both of them are overseas.

[522] Like one of them's in London, one of them's in Ukraine.

[523] And so for me, I have to pay them and, you know, I have to do this wire transfer and it's kind of expensive to do these, like you pay all these fees for wire transfers.

[524] So the idea is like, okay, well, if you have crypto, those wire fees can go down.

[525] And instead of taking, you know, maybe a day or something, it'll take like five minutes or three minutes.

[526] So I don't want to give off the idea that, like, there's nothing here. But the problem is that with the lack of regulation and the ability to send peer to peer, which means, like, you and I can just send money to each other directly, no middleman, there's also a really huge opportunity for fraud, scams, and basically, like, you know, shell games where you're hiding the money.

[527] You're saying, oh, invest in this.

[528] This is going to become valuable later, but you actually own a bunch of that token.

[529] Then you sell it off and then the price plummets.

[530] So you thought you had a bunch of money, but actually it's worth nothing.

[531] Like there's all these new scams that have emerged as a result of people getting interested in this idea of an alternative money system.

[532] I mean, yeah, especially in our modern age.

[533] I mean, it seems like you can understand where they're coming from, the average person.

[534] They're like, look, I've been screwed by the banks.

[535] Every time the government's printing a bunch of money, where do I go, right? You can understand the appeal, but it's just like you went from, you know, the arms of one huckster to another. Like, it's almost somehow worse. There are reasons that our banks have a bunch of anti-money-laundering laws. There's a reason that they have all sorts of finance laws. It's not for their safety, it's for your safety. I mean, it's like they need to, you know, fight... like, one of the best ways to fight crime is at their wallets.

[536] Like, you know, like take away their banking.

[537] And crypto has just really revitalized that because, you know, now if you're if you're some criminal, laundering money has just never been easier.

[538] Instead of taking $100,000 across the border or wiring it where it can get held up by a bank, now I can just send you $100K.

[539] It's going to take me five minutes.

[540] So that's why when people, like, kidnap people's data and things along those lines, they'd like to get paid through crypto.

[542] Ransomware, yes, yes, 100%.

[543] Because before it was like, okay, you'd need to use like Western Union or sort of one of these places where you can kind of send money without too much scrutiny.

[544] But even Western Union has been kind of, they've been getting kind of pinched a little bit, like, hey, you guys got to stop allowing all of this.

[545] But in crypto, because there's no middleman, because there's no one who controls, like, Bitcoin, like, no one can say no to a transaction, now there's nothing to stop you from sending that money. And then you can take that money and you can send it to what's called a mixer, which is this, like, fancy language for a way to, like, anonymize your transaction. Like, you put a hundred thousand dollars into this little mixer, and then it sends a hundred thousand dollars out later, and nobody knows where that money came from. What is a mixer? How does it work? It's interesting.

[546] So the most famous example is Tornado Cash.

[547] They've recently been shut down.

[548] But the idea of it.

[549] Just the idea of putting your money into Tornado Cash.

[550] Yeah, it's wild.

[551] Is there a better analogy for losing your house?

[552] You know what's funny?

[553] Yeah.

[554] You know, I mean, good Lord.

[555] The idea of these mixers was you'd anonymize your transactions.

[556] So like, let's say I put like one, let's say I have Ethereum, one ether into this mixer, right?

[557] this pool of money, a bunch of people are putting one Ethereum into this thing.

[558] So all this money is going in.

[559] And then you basically wait.

[560] And as you're waiting, Ethereum's going out everywhere.

[561] A bunch of people are withdrawing, right?

[562] Because they're also taking their money out.

[563] By the time you withdraw, there's nothing tying your Ethereum to your particular address to like this random external address because you send it to a different one.

[564] So before, it's like, if I send you a dollar and then you send that dollar on, we can easily trace that back to me, right? But if I send a dollar to you, and everyone's sending you a dollar, and then you're sending a dollar to all these other wallets, it's impossible to know which of those new wallets my dollar is from. It's a crazy idea that these, uh, basically nerds in cryptography thought of, which is brilliant. I mean, it is brilliant, because it's basically almost impossible to trace. But ultimately, the outcome of that is, like, yeah, I encrypt all your data.
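The anonymity-set idea he's describing, everyone deposits the same fixed amount, so no withdrawal can be tied to a particular deposit, can be sketched as a toy simulation. The names, addresses, and `candidate_depositors` helper are all hypothetical; this is not real Tornado Cash code:

```python
# Toy sketch of why a fixed-denomination mixer breaks the link between
# deposit and withdrawal addresses: every deposit is identical, so any
# withdrawal could plausibly belong to any depositor.

depositors = ["alice", "bob", "carol", "dave"]   # each deposits exactly 1 ETH
withdrawal_addrs = ["w1", "w2", "w3", "w4"]      # fresh, unlinked addresses

def candidate_depositors(withdrawal: str, pool: list) -> set:
    # Nothing about a withdrawal distinguishes one deposit from another,
    # so the anonymity set is the entire pool of depositors.
    return set(pool)

# An outside observer can only say: "w1 came from one of these four."
print(candidate_depositors("w1", depositors))
print(len(candidate_depositors("w1", depositors)))  # prints 4
```

The design choice that makes this work is the fixed denomination: if deposits had unique amounts, an observer could match them to withdrawals, which is exactly the dollar-for-dollar tracing he describes breaking down.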

[565] Joe, send me, I know you're the successful podcaster.

[566] I want you to send me $10 million or your data's lost forever.

[567] And you're like, call the police and you go, hey, track this guy.

[568] And they're like, to what?

[569] To a Bitcoin wallet?

[570] To Ethereum wallet?

[571] What are we tracking here?

[572] And then it goes to some mixer somewhere.

[573] And then we don't know where it goes after that.

[574] So when Sam Bankman-Fried was working with regulators, when he was trying to impose regulations or encourage regulations, how could that have benefited him as opposed to Binance?

[575] Like, what could they have possibly done to make it easier or more profitable for him?

[576] Like, why would he do that?

[577] I'm not as familiar with the regulation side of things.

[578] People were talking about that a lot.

[579] What I know is everyone's always interested in pulling up the ladder after them and, like, building.

[580] the kind of the rulebook around like, hey, like, if you're from this certain jurisdiction that we're a part of, you're fine, if you're not, like if you're from this one, you're not okay.

[581] Or I might say, hey, like, CZ has connections to China.

[582] Like, maybe that's a problem.

[583] Or CZ has connections to here.

[584] Maybe that's a big deal.

[585] But I'm from the Bahamas and I'm American, so that might be fine.

[586] I mean, everyone's always interested in the regulations benefiting them.

[587] The challenge now, though, is, you know, a lot of people had backed that bill.

[588] and now that it was all a fraud, or the guy who basically pushed it was a fraud, now they're, like, trying to retool it. And it's, like, sort of, what's left after the guy who was kind of spearheading this bill was a fraud? It's kind of tough. I was actually randomly, like, some senator's office reached out to me, and they're like, what do you think about this? And I was like, I don't know, man. You guys have to... this is y'all's thing to figure out ultimately.

[589] Ultimately, y'all have to... My feeling is offshore entities should not be, they're not subject to our rules.

[590] How can you allow offshore like, yeah, I don't know.

[591] I don't know.

[592] It's very strange.

[593] And these offshore entities were also using like US branches, like there's FTX US, which was like more regulated, but not really that regulated.

[594] It's a little, it's a strange time, man, to be covering crypto, because I tried to tell people for years that this scam problem, this fraud problem, was going to undo sort of everything.

[596] Like if you don't root out the scams, you don't find ways to solve that.

[597] This is never going to work, because for it to be great, the money system has to be safe.

[598] Like your grandma has to be able to charge back her credit card when there's a fraudster, right?

[599] Or this whole thing doesn't work.

[600] You can't rely on people being technically savvy in order to, you know, make something work.

[601] If it's going to go to the public, you have to solve all these issues.

[602] And unfortunately, we saw crypto kind of go mainstream before they had really taken that.

[603] Maybe some of them were taking it seriously, but not enough.

[604] So if FTX didn't encourage regulation and CZ didn't get upset at that and sell off all his tokens, would they still be solvent today or still in operation today?

[605] Would they not crash or was this inevitable?

[606] It was inevitable.

[607] So something to understand.

[608] FTX was insolvent long before it was realized that they were insolvent.

[609] Right?

[610] So that's the issue was FTX's problem was not CZ.

[611] Ultimately, he's kind of the guy who, like, pushed over the house made of sticks or something.

[612] Like it was never, the problem was the foundation was wrong from the beginning.

[613] If you don't have enough deposits to cover withdrawals, you just don't have the money, right?

[614] Your issue is that any time there's demand for withdrawals, you're going to encounter problems.

[615] So it was going to be inevitable any time any story broke that showed that maybe they're not as healthy as they should be, there would have been a run on the banks and people would have found out.

[616] It's just like, when are they going to find out?

[617] It's just he happened to be the final straw if that makes sense.

[618] So someone would have figured it out and someone would have started dumping their coin.

[619] Yeah, people already, I mean, even leading up, like, CZ gets a lot of the credit for it, but like already, like a day before they kind of shut down, myself and some other people were saying, like, we think they're insolvent because we had taken a look at their numbers and we said, there's no way they have the money for this.

[620] They don't have the tokens.

[621] So we were warning people, hey, this is probably insolvent, get your money out.

[622] But CZ ultimately was the big, like, he was the most notorious and, like, a well -respected person in the space to where people thought, okay, well, if he's saying it, you know, he's a guy who only says positive things about crypto, because he's a crypto, you know, executive.

[623] So if he's saying there might be problems, there's probably some problems.

[624] But Binance hasn't had similar problems.

[625] No, they've had a few runs, and they've covered withdrawals.

[626] I mean, so it's just the problem is a lot of it's a black box.

[627] I mean, so it's like things are good for now.

[628] Yeah, you don't know.

[629] That was the problem with FTX.

[630] Like, they had hidden all their numbers. Like, Sam literally had a $10 billion account that he mislabeled with Alameda Research.

[631] Mislabel.

[632] Yeah, he called it Fiat at FTX.

[633] But it was a $10 billion hole.

[634] What do you mean by mislabeled?

[635] Well, it was on a spreadsheet, Joe.

[636] So he was on a, put it on a spreadsheet for their balance sheet, and he mislabeled the account, Fiat at FTX.

[637] And so what prosecutors are now arguing is he knew, of course, what it was.

[638] He deliberately obscured what that was to hide it from people who were trying to take a look at his books.

[639] But it's just, that's what I mean by Black Box.

[640] You never know what games these guys are playing.

[641] Like they say, oh, here's sort of like the rough estimate of our balances.

[642] But oops, did I tell you about this $10 billion account?

[643] Like, I forgot.

[644] It's so silly.

[645] You find out, like, there were just no adults in that room, and, like, the few adults that there were, were, like, you know, they had, like, a criminal lawyer.

[646] Well, anyway, I don't think he's actually been convicted of anything.

[647] I shouldn't say that.

[648] They had this guy, Dan Friedberg.

[649] Shady, shady, shady.

[650] This guy, Dan Friedberg.

[651] No, he's definitely shady.

[652] I'll say that.

[653] What was his, I remember, but what was his?

[654] His whole thing was he did this thing with Ultimate Bet. So he was one of the lawyers. So there's this poker site called Ultimate Bet, and he got caught in this scandal where they had enabled this thing called God mode on Ultimate Bet, where the CEO could see everybody's hands and play on the site seeing everybody's hands. So he just cleaned up on all his own, like, his own customers, just basically taking their money. Like, oh, I know exactly when to fold, I know exactly when to bet. So he had God mode enabled, and then somebody found out about this God mode. And so the lawyer's like, how do we basically cover this up? Dan Friedberg's like, you know, what do you want me to do? And he's like, hey, just make this problem go away. This is the CEO. Like, go blame it on somebody else. Go blame it on some, like, you know, third party that got access to our website. Say it was, like, a glitch or something.

[655] And so that that is the experience of the lawyer that FTX then hires is like being complicit on a private call, leaked private call, trying to cover up this God mode scam.

[656] That is his background.

[657] And so I asked, you know, Sam, I was like, you know, what does it say if this is your chief regulatory officer?

[658] This guy who enabled God or who helped cover up God mode.

[659] And he's like, well, I don't want to comment on other, you know, people, or it's just like...

[660] Well, how does he skate on that?

[661] Like, how does a guy like that not wind up getting indicted?

[662] I ask myself that every day.

[663] There's so many...

[664] Is it a matter of time, or is it he's gotten away with it?

[665] So many of these scams are like these issues of either regulators not having time, not having the resources, not having sort of like, it's maybe not big enough.

[666] You know, they're good people, a lot of the people going after these guys. But it's like trying to catch everyone who's speeding, you know what I mean? People get away with it. There's just too many people doing it. You'll catch some people, but ultimately a lot of people will just basically skate by, even though by all rights they should have been, you know, caught. In my view, what he did was criminal. That said, it hasn't been prosecuted or anything like that. But he's on a leaked private call. Everyone can go listen to it yourself. It's just this shocking thing.

[667] And I think that shows, like, if you're running a shady empire, like, who's better than a shady lawyer to try to help you cover it up, right?

[668] How did you get involved in what you do?

[669] It's a weird thing.

[670] So when did you start your YouTube channel?

[671] So I started it a few years ago, 2018, 2019.

[672] And what was the first video?

[673] I started as sort of like an interview show, nothing about scams.

[674] I had a channel before it.

[675] So I went to school for chemical engineering and hated it.

[676] I was miserable.

[677] I was like, I do not want my life to be earning 2% more of, you know, of a bottom line for ExxonMobil or any chemical company.

[678] I just wasn't interested.

[679] I was like, that's not my life.

[680] So I always wanted to sort of, you know, have a voice.

[681] And so I started a YouTube channel just doing random videos.

[682] I hadn't really found my footing.

[683] But throughout my entire life, I had kind of had this relationship with like hucksters and fraud where, you know, when I was in high school, my mom got thyroid cancer, very treatable kind of cancer and she's fine.

[684] But at the time I watched her as she's like, gets this diagnosis, gets swept up with all these hucksters who are telling her that the way to treat thyroid cancer is not surgery.

[685] You can just treat it naturally.

[686] Just don't worry about, hey, don't listen to the, you know, the doctors.

[687] Don't listen to your general practitioner.

[688] You can just treat it with, like, colloidal silver, or just put a bunch of garlic cloves in the pot.

[689] I still remember our house like reeked.

[690] She would put 60 cloves of garlic in, like, in a stew, and she would drink it up because she thought that would make her better.

[691] Ultimately my dad convinced her like you got to get the surgery.

[692] Like this ain't this ain't going to fly.

[693] You have to ultimately get the surgery which thankfully she did and she's fine now.

[694] She takes medication to replace the hormones her thyroid would generate.

[695] But I saw my mom kind of get swept in this thing that I knew was nonsense.

[696] But it's sort of like hard.

[697] You kind of have to disprove every single, like there's always a new like health guy telling you that there's some new alternative discovery, whatever.

[698] And I was like, this is kind of weird.

[699] And I was like, why do they hate doctors so much?

[700] And it always seems to like end up with a sales pitch.

[701] Like it never was like, hey, let me just give you this free thing.

[702] It was like always like there's something, there's a catch.

[703] So I didn't really know what I was looking at at the time.

[704] Then I go to college and all my friends get an MLMs, multi -level marketing, you know, sort of like just like the like, hey, you're going to get rich.

[705] So I was always getting invited to these like get rich seminars.

[706] And, uh, and I'd go because it was like my friends like said, hey, we have to get somebody.

[707] You know, you want to go?

[708] And I was like, sure, I was like kind of fascinated.

[709] And you'd see these guys, you know, they're like, hey, don't work a nine to five job.

[710] Like be free like me. And I'm like, you're here on a Sunday at like, 5 p .m. How free are you really?

[711] Like, you're just like, you're just kind of grifting here.

[712] And so, but you'd see them in nice cars.

[713] And so I was like, what is the, what am I looking at?

[714] And then as I'm doing my YouTube show, I'm like, I get fed a bunch of ads, like get rich quick schemes.

[715] Like you've got a bunch of people, you know, flexing in their Lamborghinis telling you, you know, they're like 25 years old telling you, you want to get rich by 25 or 22.

[716] I'll show you.

[717] I made a million dollars.

[718] I'm a millionaire by the time I'm 23 years old.

[719] Just buy my course.

[720] My course is, you know, $2,000.

[721] Pay me $2,000.

[722] I'll teach you to get rich quick.

[723] So I saw all this, and all my experiences up to that point.

[724] It kind of led me to like, I want to say something.

[725] Why is nobody saying anything?

[726] It just seemed like there was this, you know, these people pitching this stuff and nobody was talking about it.

[727] So I made this random video just basically screaming about, you know, all these scammers online.

[728] And unlike my previous work, which had kind of resonated, like it had gotten some reactions, but not much.

[729] What I noticed is it resonated with people beyond the views, if that makes sense. Like, there was something different about the reaction to it. And, you know, victims would reach out to me. They'd be like, hey, I'd been scammed by this guy, and I didn't realize what was going on, and you showed me, you know, sort of, like, how the whole scheme worked. So I decided to start pursuing it step by step. And at first it was, like, just me discovering, like, well, what is this? How does this scheme work?

[730] Okay, so I buy this course and then what?

[731] What are you saying in the terms of service that means that I can't sue you?

[732] You have all these terms of service that basically say none of what I'm saying is true.

[733] Like they say they can get you rich in the sales pitch and then in the terms of service they said, results may vary.

[734] What's that about?

[735] I mean, ultimately it's like, and so I realize, like, oh, there's this sophisticated way that they're preying on my psychology, and they're setting it up with, like, I used to be broke like you.

[736] Well, that's a strategy.

[737] A lot of these guys were never broke, right?

[738] And it's just part of the story you have to tell to be really effective.

[739] It's like, I used to be just like you, Joe, but then, you know, I found out that doing Amazon drop shipping is the way to make millions of dollars.

[740] And, you know, I used to fail, but by these little tricks, I found out how to be successful.

[741] And if you invest with me, I'll save you time.

[742] You know, you could do it yourself, Joe.

[743] You could do it.

[744] But what, it's going to take you five years.

[745] Get with me, and I'm going to shortcut your success.

[746] two months.

[747] You're going to be making five figures a month, 10 months, maybe six figures a month.

[748] And I've done it for people before.

[749] That's the social proof.

[750] I've shown people how to do this.

[751] You can watch them.

[752] These are real people, Joe.

[753] You can be just like them.

[754] And so I started watching this and I started seeing it.

[755] I'm like, oh my gosh, it's so interesting.

[756] I start covering it.

[757] And then I start to get cease and desist letters.

[758] They don't like that.

[759] So they start to send me, they say, hey, you better shut up or we're going to sue you.

[760] And I'm like, oh, my gosh. So I was like, okay, I'm not going to stop making these videos.

[761] I just kept making the videos.

[762] And ultimately, they never did.

[763] But I start doing that.

[764] And after I cover Get Rich Quick schemes for a while, I start hearing about these tokens.

[765] And they're like, hey, selling courses, like, it's always the new grift.

[766] You always have to find, because people figure it out.

[767] Like, they go like, oh, that actually doesn't work.

[768] Like, hey, drop shipping is not actually like this incredible business that you thought it was.

[769] that you're going to get rich easily.

[770] So don't do that.

[771] Go do crypto.

[772] You got to get into crypto now.

[773] And then it became NFTs for a while.

[774] But like, so I started, I eventually like pivoted into this crypto direction and learned all about that.

[775] But it started just from a curiosity about scammers.

[776] And I wanted somebody to say something because I was just like, why does this make some people tens of millions of dollars and nothing happens?

[777] why are some of these people making hundreds of millions of dollars people are miserable at the end of it and nothing happens and that that was the start of my show so you start just doing interviews about what like you just you didn't start doing this you started doing just like a normal interview show i was just doing a normal interview show with a few of my buddies um and it just was kind of i was just trying to find my way i was just trying to like even before that i had done a show where I was like trying to break down these topics.

[778] I was like researching addiction and I was just like trying to, you know, make some digestible piece of media around like addiction, right?

[779] Because like I, I always was interested in communicating complicated ideas in a digestible way.

[780] I just felt like, man, there's so much cool science out there.

[781] There's so many cool ideas out there.

[782] How do we communicate this?

[783] So I did that for a while.

[784] Then I started like, CoffeeZillow was like this spinoff channel.

[785] And I was like, let me do some interviews.

[786] And then it was also my place.

[787] I just threw things at the wall.

[788] So then I, that's where I threw one of my rants. Like, I just, like, ranted about this thing against the wall, and it kind of, like, stuck, and I just enjoyed it. I was like, man, screw these people, you know? Like, people are getting taken advantage of. And what was sick about it is they're not taking advantage of rich people, because rich people will sue you if you screw them over. Rich people will sue you, right? They're taking advantage of, like, people who, they're at $10,000 or $2,000, that's, like, all their disposable income, and they're betting on these hucksters to dig themselves out of these situations. And that's one of the things I try to tell people: a lot of the success of these things is not from, it's not even about greed. It's about desperation when you fall for these things. A lot of times, you know, you're like my mom. Like, the reason she fell for these things is she so badly didn't want surgery that she was willing to believe anything, right? Because she's like, you know, if I have cancer and you tell me I can be better, and you tell me it's $10,000 or you tell me it's a dollar, I'll pay you either way, right? And so people are financially, they feel like they're terminally ill financially.

[790] They're just like, I don't know how to get out of this.

[791] I feel like I have no opportunities.

[792] This guy, I'm watching YouTube.

[793] I'm trying to better myself.

[794] I'm trying to educate myself.

[795] And this guy comes on and tells me it's all click away, right?

[796] It's all a credit card swipe away.

[797] What has been the reaction?

[798] Like what has been the most visceral or violent reaction to what you've done and exposed?

[799] I think the biggest story we probably ever broke was maybe the FTX stuff, but that was already kind of going on.

[800] It was probably the Logan Paul story, the Crypto Zoo saga.

[801] That was a case where, you know, it's just the classic influencer greed story, where this guy launches an NFT project, does millions upon millions of dollars in sales, and delivers nothing.

[802] He promises the world, a fun blockchain game that earns you money.

[803] And he did nothing, and the project was left abandoned, and people were, like, miserable, complaining, complaining. No one says... But I'm not aware of that. I'm kind of aware that you covered it, but I don't know the story. So let me, let me back up then. So, Logan Paul is a popular influencer. You know who... Yeah, sure. So he, along with a lot of influencers, got really interested in, like, the crypto space, and he had done a coin before that called Dink, which was abandoned shortly after he promoted it.

[804] People got invested.

[805] It goes to zero, right?

[806] And he says, well, that's not my project.

[807] That was my buddy's project.

[808] And then like a month later, he's like, I actually do have a project, excited to announce it.

[809] It's called Crypto Zoo.

[810] It's a fun game.

[811] They called it a fun game that earns you money.

[812] Basically, the idea is they're going to sell you these two things.

[813] Eggs, as NFTs.

[814] And then there's a coin aspect to it called zoo tokens, okay?

[815] So you can buy these zoo tokens to buy the eggs, and the idea is the eggs will then hatch into animals that will earn passive zoo tokens.

[816] So you can buy eggs with zoo tokens, and then the eggs will passively earn you zoo tokens.

[817] Does that make sense?

[818] No. Well, don't worry.

[819] You're kind of actually caught up.

[820] So these zoo tokens were basically this passive income.

[821] You know, you basically invest up front, and then you're sort of getting the tokens back out, which you can then sell, I guess.
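
The mechanic described here, eggs bought with tokens that then passively emit more tokens, can be sketched as a toy model. This is a hypothetical illustration of the advertised scheme only, not the actual CryptoZoo contract; every name and number below is made up:

```python
# Toy model of the "eggs earn tokens" pitch described above.
# Purely illustrative -- NOT the real CryptoZoo contract.

class ToyEggStaking:
    def __init__(self, egg_price_tokens, yield_per_block):
        self.egg_price = egg_price_tokens       # tokens paid up front per egg
        self.yield_per_block = yield_per_block  # tokens emitted per egg per block
        self.eggs = {}                          # owner -> number of eggs held
        self.balances = {}                      # owner -> tokens earned so far

    def buy_egg(self, owner, tokens_in):
        # Buyer spends tokens up front; the treasury keeps them.
        if tokens_in < self.egg_price:
            raise ValueError("not enough tokens")
        self.eggs[owner] = self.eggs.get(owner, 0) + 1

    def accrue(self, owner, blocks):
        # "Passive income": each egg mints new tokens out of thin air,
        # which is why such schemes depend on endless new buyers.
        earned = self.eggs.get(owner, 0) * self.yield_per_block * blocks
        self.balances[owner] = self.balances.get(owner, 0) + earned
        return earned

pool = ToyEggStaking(egg_price_tokens=100, yield_per_block=1)
pool.buy_egg("alice", 100)
print(pool.accrue("alice", blocks=50))  # 50
```

The emitted tokens only have value if someone keeps buying them on the other side, which is the structural problem with the scheme as described.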

[822] So that was the idea pitched to people, and people immediately buy in.

[823] Three million dollars in NFT sales, tens of millions of dollars in the tokens itself, the zoo tokens.

[824] People are so excited about it.

[825] Because it's Logan Paul.

[826] And he says, this is his project.

[827] He's putting his name behind it, his backing behind it, and he's a great marketer.

[828] I mean, you've got to give the guy credit where credit is due.

[829] He's a tremendous marketer.

[830] So people get all excited.

[831] All of a sudden, the hatch day comes when you're supposed to hatch these eggs and half the hatching doesn't work.

[832] How does the hatching work?

[833] Is it on a computer model?

[834] It was on the blockchain.

[835] So you could like, your NFTs would turn into different NFTs.

[836] Like they would like, they would transform into the animals.

[837] They go from an egg to an animal.

[838] How?

[839] It's just blockchain coding.

[840] I mean, it's just, It's just...

[841] But how do they...

[842] Is it predetermined?

[843] Yeah.

[844] Like, how does your egg become an ostrich?

[845] It's just random...

[846] It's like, it's supposed to be randomly generated animals.

[847] So you...

[848] And so you might get a rhino, you might get a chicken.

[849] Exactly.

[850] And then you could like crossbreed your rhino with like a chicken and get like a rickin or something and get even more tokens.

[851] But...

[852] Is this it?

[853] Yeah, there it is.

[854] You get like...

[855] Bear shark.

[856] So like...

[857] So like...

[858] So people start to like...

[859] Is this still around?

[860] So they say they're going to go back and fix it now.

[861] So Logan, after being not involved for like a year, as soon as my video comes out, he goes, damn, what a coincidence.

[862] Like, I've been working on it.

[863] Like, I was going to, you know, make it, like launch it.

[864] In reality, he hadn't touched it for a very long period of time.

[865] But so sorry, to back up.

[866] Okay.

[867] Half the eggs don't work.

[868] And they're not actually earning anything.

[869] The whole time they said they're going to earn you these tokens, right?

[870] They're not earning anything.

[871] So the promises haven't been fulfilled.

[872] There's just sort of all this stuff going on behind the scenes, and Logan's quiet.

[873] Come to find out, he had hired basically criminals who were selling on the back end, like some of the tokens.

[874] And he was sort of like, I don't know what his thing was.

[875] I think he realized like, oh, it's not going to be that successful.

[876] Let me move on.

[877] I think his mentality was, let me just move on, right?

[878] The problem, though, is you have millions of dollars.

[879] of investment in a thing that you promoted.

[880] You told everyone it was going to make them money, and then you never delivered anything.

[881] So my story was basically showing that, showing the victims of the scheme, and in response, he's like, well, I'm going to sue you for that.

[882] He said he's going to sue you?

[883] Yeah, he said, I'll see you in court.

[884] And then the backlash against him was so severe that he releases a video saying, thank you, Coffeezilla, for showing the world what happened.

[885] And I appreciate it.

[886] I responded out of anger, but I'm going to make things right.

[887] I'm going to fix the game to what it was supposed to be, and I'm going to pay back $1.7 million.

[888] I'm committing $1.7 million so that anyone who bought an NFT can get a refund.

[889] Now, there's a bit of an issue with that.

[890] So that's nice.

[891] I actually think it's great that that happened, but there's two issues with it.

[892] Number one: the NFTs were only a small part of the sale.

[893] They actually weren't even half.

[894] Because people bought these tokens.

[895] So the people who bought tokens get nothing.

[896] He's offering, you know, this refund on the NFTs.

[897] The other problem is he hasn't refunded the NFTs.

[898] I've been, I've actually reached out to him twice.

[899] It's been like over a month since he's done this.

[900] So he said he's going to do it.

[901] And then the Discord, like he's posting in this little chat room with the investors.

[902] After he said he was going to do it, he's posted nothing.

[903] There's no way to get a refund right now.

[904] So I keep asking him, like, hey, where is it? You promised $1.7 million to these investors.

[905] They're all waiting.

[906] It's been over, I think it's almost been two months now, and there's nothing.

[907] So it's like, you know, he says that he's refunding people, which sounds great for PR, and then it's just like radio silence.

[908] So what I'm ultimately looking for is some accountability from these guys.

[909] They're happy to make money from the endeavors.

[910] They're happy to potentially make millions of dollars from these, you know, different projects.

[911] They're spinning up.

[912] But the second accountability is asked for, you can't reach them.

[913] So is it, well, I would assume Logan's a very busy guy.

[914] Sure.

[915] I would assume that he probably didn't come up with this on his own.

[916] I would assume that someone probably came to him with this project.

[917] This is just total assumption.

[918] Guesswork.

[919] Guessing on my part.

[920] So we have text messages from behind the scenes.

[921] A lot of the people who were responsible say Logan kind of spearheaded the idea.

[922] And he says he spearheaded the idea.

[923] So it was his idea?

[924] Yeah.

[925] And so he's working with someone, right, that probably assured him that this would work?

[926] Yeah.

[927] I mean, he had this team of a few guys who didn't get much vetting, and some of them turned out to be criminals.

[928] But, you know, my feeling is ultimately, no matter what happens, like, when you take people's money... that's what I'm trying to tell these, like, influencers on my show.

[929] Like, when you take people's money, it's different.

[930] When you tell them you're going to make them money and you get into the financial investment game, your responsibility is different.

[931] You can't just always pass the buck to like, oh, it was like a guy that's not that trustworthy.

[932] It's like, all right, that might be true.

[933] Then go fix it.

[934] Go hire some more guys that are trustworthy and fix the thing.

[935] And I think, in my experience, because I've talked to Logan, and that's why I know he didn't respond to me, because I texted him.

[936] I said, hey, where's this money?

[937] He left me on read.

[938] But I've talked to him.

[939] And when I talked to him, you know, there's just sort of this feeling of he's like, I just don't want to think about this.

[940] I don't want to be, you know, he wants to focus on prime, which is successful.

[941] He doesn't want to be bothered with the victims of the scheme that he ultimately thought of in the first place.

[942] So it's still, is it possible that he's just gathering the money or working out a way to do it legally where it makes sense?

[943] It's very frustrating, because, you know, at every turn it's just sort of, like... I want to say it's possible. We just don't know. And it's just sort of like, when you promise people refunds, the longer you wait, you know, the fewer people are actually going to take that refund.

[944] If Walmart says, hey, bring in this skull, I'll give you a refund, and you're like, all right, when can I bring it in? And then I don't respond to you for two months. They know that you're less likely to actually take the refund.

[945] So I don't know if he's doing it because he wants less people to get the refund.

[946] He probably is busy.

[947] But my thought is a transgression of this magnitude where you're playing with people's money and livelihoods, you cannot take it lightly.

[948] And that's one of the things is these influencers got into this crypto space.

[949] I don't think they fully appreciated.

[950] They're now dealing with financial investments.

[951] And it's not a joke.

[952] It's not like a brand deal where, you know, if NordVPN isn't as great as they said it was, you know, it's all cool.

[953] Right.

[954] Now it's your company, and you promised people you're going to make them money, and you haven't said anything for over a year; then you say you're going to refund them, and you don't say anything for two months.

[955] That's an issue.

[956] The whole crypto space and the whole NFT space is filled with weirdos.

[957] Like, everyone that I've talked to that wants to come to me with some idea, it's always, always very strange.

[958] Like, when people have come to, you know, like, my business manager with financial propositions, it's always logical.

[959] Like it makes sense, oh, invest in this, this is a fund and it does this and this is how you get a return to your investment.

[960] None of that stuff ever made any sense to me. I avoided all of it, luckily, but I was propositioned by multiple different entities about these kind of things.

[961] And I was like, I don't know what you're saying.

[962] I don't know, like, why would anybody buy an NFT?

[963] Like, you know, oh, it's a non -fungible token.

[964] And then you put it in an NFT wallet and you have this thing.

[965] I'm like, but I have the same thing on my phone.

[966] I can take a screenshot of that NFT and I have it.

[967] Like, what is the thing, the physical thing?

[968] You know, it's like I understand like Beeple.

[969] Do you know who Beeple is?

[970] Oh, yeah, yeah, yeah.

[971] Yeah, so Beeple made that little Gigachad thing for us.

[972] It's a piece of digital artwork. Yeah. And, you know, he has an actual museum of digital art, right? And if you buy a piece from him, you actually get a physical piece of digital art. There's something there. Yeah, I get it, it makes sense. But, like, the Bored Ape Yacht Club, whatever the fuck that is, like, what's going on here? Like, I don't... I have a friend of mine who's an artist who made over a million dollars on NFTs, and I'm like, what did you do? And, like, he talks to me for 10 minutes, and I'm like, I don't even know what the fuck you just said. Yeah, so let me start by saying: I work with a super talented digital artist, he does a lot of my set stuff, so I have a lot of respect for, you know, the challenge of a lot of digital artists, as opposed to physical artists. Like, if you're a painter, you sell your paintings, right? If you're a digital artist, what do you do, print it out? What do you do? So NFTs were sort of... originally it was like, this is for artists. Like, this is a way for a digital artist now to legitimately sell scarcity in their work, which previously they had no way of doing. You can still take a screenshot, but you don't own the NFT that, like, the digital artist has sort of provisioned as the thing that matters. So in that way, in that one way, I get it. I get why people wanted it to become the next big thing. The problem is it was quickly taken over as an investment vehicle.
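
The "you can screenshot it, but you don't own it" point comes down to the fact that NFT ownership is a ledger entry, not the image file. A minimal sketch (all names and the URI are illustrative, not any real contract's API):

```python
# Sketch of NFT "ownership" as a ledger entry. Illustrative only.

ledger = {}      # token_id -> owner address recorded on-chain
token_uri = {}   # token_id -> where the artwork file lives (public)

def mint(token_id, owner, uri):
    ledger[token_id] = owner
    token_uri[token_id] = uri

def transfer(token_id, frm, to):
    # Only the recorded owner can move the token.
    if ledger.get(token_id) != frm:
        raise PermissionError("only the recorded owner can transfer")
    ledger[token_id] = to

mint(1, "0xArtistFan", "ipfs://examplehash/artwork.png")

# Anyone can "screenshot" the artwork -- the URI is public:
copied = token_uri[1]

# But the ledger still records who owns token 1, and only they can sell it:
transfer(1, "0xArtistFan", "0xCollector")
print(ledger[1])  # 0xCollector
```

The scarce thing being sold is the ledger entry; the pixels themselves remain freely copyable.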

[973] Now it's like everybody's an art dealer.

[974] And now everybody's an art expert.

[975] And now we're trying to make a buck.

[976] Right.

[977] And anytime you get art involved with money, things get weird.

[978] But especially when you get art involved with quick flips and returns.

[979] And now we're going to all make money from this.

[980] That's when things get really weird.

[981] So like, I feel bad sort of for digital artists, legitimate digital artists who really do legitimate NFT work.

[982] I don't think there's anything wrong with selling your work as a digital artist.

[983] Like, what do you expect them to do?

[984] Not everybody can go work for like some random YouTuber.

[985] Like, you know, people have to earn a living.

[986] They do legitimate work and good work.

[987] But the problem is when greed gets involved, when people get involved basically promising, you know, money.

[988] In the case of the Bored Ape Yacht Club, it's sort of like... their idea was, we'll start almost like a country club, where the NFT is the pass for the country club, and, like, you can go chat with the, like, holders of this Bored Ape Yacht Club. And I guess the idea is, like, because it's expensive, then you get in the room with, you know, people with money. But I found that whole thing weird, because of the, like... you know, Jimmy Fallon's getting involved, and then all these, like, mainstream celebrities, you know, start promoting this thing, and it's like, this is a little... why is everyone doing it? And then you come to find out that a lot of them had their Bored Apes bought by this company called MoonPay, who was trying to, like, you know, use the celebrities' likeness to push that out.

[989] And it's just like, it's just a strange, what's actually going on here?

[990] Is it just about the art?

[991] It doesn't actually appear to be.

[992] I just don't understand how it worked.

[993] I don't understand how anybody looked at it and went, this is logical.

[994] I'm going to buy that.

[995] So think about it.

[996] So think about it this way, though.

[997] So I'm sure you played a bunch of games, video games, right?

[998] Have you ever played a video game where, like, they have, like, in -game, you know, skins and, like, different, like, outfits?

[999] Sure.

[1000] So people, so tons of businesses have been built, like, the entire free -to -play model of Fortnite, you know, Fortnite makes millions and millions and millions of dollars.

[1001] Their whole model is built on skins and, like, different, like, in -game purchasable items.

[1002] You don't actually own anything.

[1003] Ultimately, it just lives and dies with your computer.

[1004] NFTs are sort of, like, I guess the idea with NFT gaming or.

whatever, it's like, where you would actually own it. Like, the game couldn't take it away from you. You'd have some piece of art that you'd have some ownership of, that would matter. Um, again, I think the challenge is just, like, where greed and, like, marketers get involved, it just sort of, like, ruins everything with scams and fraud, to where it's very tempting, and I get the temptation, to just throw everything out and go, it's all just a fraud, right? Because you see so much of it, and so much of it is just, like, kind of people trying to scam you, basically, you know, and especially use celebrity likenesses to scam people.

[1006] Yeah, the celebrity part is a big key in all this, right?

[1007] I mean, it's a huge part.

[1008] This is how we get legitimacy for products now.

[1009] It's like sort of like endorsements.

[1010] Endorsements.

[1011] It's like you've got to find a guy to do it.

[1012] So ultimately like, and the AI stuff's scary because ultimately you'll get the AI deep faking you into, you know.

[1013] Yeah, there's one of me. There's one of me and Andrew Huberman selling some supplement that's not real.

[1014] Right.

[1015] Yeah, Alph.

[1016] I don't know if the supplement's real, but I know that the commercial's certainly not real.

[1017] Yeah, they deep faked you and I think it went viral on Twitter for a bit.

[1018] Yeah, well, everybody knew it was a deep fake, luckily.

[1019] It wasn't quite good enough.

[1020] Yeah.

[1021] And then, you know, we tried to figure out who's doing it and you just run into a bunch of shells.

[1022] It's very difficult. I'll tell you offline who's doing it.

[1023] Okay.

[1024] I know.

[1025] I looked into it because I was curious.

[1026] I was curious.

[1027] And, you know, that same person had put out a lot of ads about, like, Kim Kardashian.

[1028] They had a deep fake of Kim.

[1029] They had one of you saying that, like, so they have one of you saying, like, this product's great, you know, go buy it.

[1030] And then there's another one where you were complaining that Andrew Tate launched it.

[1031] And you thought you were sort of like Andrew Tate's going after my brand.

because it's very similarly named to one of your products. And so it was kind of this hilarious thing where they were playing both sides. It's like, it's Joe Rogan's; it's also, Joe Rogan hates that it's out there because it's so good. Then it's like, Kim Kardashian loves it. They had every celebrity, like, basically endorsing this thing, all through AI. And it's just a testament of our times. Like, celebrities are the new sort of authorities, for better and often for worse. But, like, people use that as currency now, and, like, with AI you can just, like, fake a lot of that stuff.

[1033] That's what I'm worried about is the volley.

[1034] I feel like this is the very first volley in a war on reality.

[1035] And that the way AI is structured, it's so, it's so prevalent.

[1036] And so, like, when you look at ChatGPT, and then you look at deep fakes, and you look at the ability to take... I mean, there's a whole podcast of me interviewing Steve Jobs that doesn't... it's not real.

[1037] And it sounds like a real podcast.

[1038] There's a lot of podcasts. Yeah, it's crazy. Sometimes I'll check one, I'll go, is this real?

[1039] I saw there's a bunch going around. Now they can imitate anyone's voice. Like, I think you were probably one of the first, because you have so many hours of footage, so they had a lot of training data. There was a Canadian company that showed, like, sort of a proof of concept of this a few years back, and I was like, oh boy, I know where this is going to lead. Because they just took all the hours of footage, so they basically have me at every pitch and tone, and yelling and laughing, and they can have me say anything at this point.

[1040] Literally. And now they're getting really good at the inflection, because one of the problems with these AI tools was they were very monotone, and they could only imitate your voice in a monotone.

[1041] But now they're getting better at like, okay, we'll accent the voice, and then we'll talk calmly, and then we'll be able to, you know, get more excited.

[1042] So that's a huge problem.

[1043] Have you seen the face ones, though?

[1044] That's the new one.

[1045] Jamie, can you pull up the new TikTok face filter?

have you seen this? Which face filters? The new... which one in particular? I think it's, like, the glam one. There's a bunch of Twitter threads right now on it. I've seen the glam ones. It's amazing. It's amazing how they can put makeup on you. You look... no, no, you look different. Yeah, you look different. Yeah, it's... it's literally going to be this new world where you won't know. Like, catfishing is going to a new level. Yeah, you have no idea what someone looks like. There's a woman who did this ad, and she was laying in bed, and she's like, I don't have any makeup on. And in the old ones... like, there's a really funny video of this person that I know, actually, who put this filter on, and in one of the scenes she puts her hand in front of her face, and the lips are superimposed on her hand, and it looks so preposterous. And the fact that she's so not aware of the fact that this thing is happening, and she put the video out... it's like, we were laughing so hard. Like, first of all, you don't look like that, everyone knows you don't look like that. And then, when you put your hand in front of your face, you didn't see these fucking giant cartoonish fake lips that came over your palm? This is so crazy. So this is the one that I saw, this woman... like, this is crazy. Yeah. Now if you touch... yeah, well, now if you touch your face, you should be able to... they don't, like, superimpose any... like, it's all really real. Like, you can do anything to your face and you can manipulate it, and the AI tracks it all. I mean, and I've seen people do it where they have two screens, like one that's actually them and one that's them with the filter.

[1047] So you see it side by side.

[1048] It's shocking.

[1049] Yeah, it's really worrying, like, you know, these technologies.

[1050] Part of the problem is you can deploy them so cheaply and at scale, to where, you know, in my world, I'm more worried about, like, the Joe Rogan deep fakes and, like, people scamming people out of money.

[1051] But I also worry about like the romance scammers.

[1052] Yeah.

[1053] Like, how good that's going to get when ChatGPT now has all the scripts down.

And instead of paying someone to do it...

[1055] You have someone FaceTime this person.

[1056] Oh, yeah.

[1057] You have someone FaceTime them.

[1058] You have it all generated by an AI.

[1059] It costs you almost nothing to do.

[1060] I mean, part of the rise of, like, robocalls was it's just cheaper.

[1061] Like, it's really hard if you're going to hire people to do it.

[1062] You kind of need an ROI.

[1063] If you have robots, you know, sending spam, now it's good.

[1064] Because you don't actually need to earn that many dollars per call to make it viable.

[1065] So you just call everybody.
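
The economics being described reduce to break-even arithmetic: automation shrinks the cost per call, so the hit rate needed to profit collapses. The numbers below are invented purely for illustration:

```python
# Back-of-envelope on why robocall scams scale. All figures are
# hypothetical, chosen only to show the shape of the incentive.

def required_hit_rate(cost_per_call, payout_per_victim):
    # Fraction of calls that must convert just to break even.
    return cost_per_call / payout_per_victim

# A human caller with real labor cost vs. an automated call:
human_rate = required_hit_rate(cost_per_call=2.00, payout_per_victim=500)
robot_rate = required_hit_rate(cost_per_call=0.01, payout_per_victim=500)

print(f"human caller breaks even at {human_rate:.2%} of calls")  # 0.40%
print(f"robocall breaks even at {robot_rate:.4%} of calls")      # 0.0020%
```

With a near-zero marginal cost, almost any conversion rate is profitable, so the rational strategy is exactly what's said above: just call everybody.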

[1066] One of my daughters got a phone call about how much money

[1067] she owes, and that if she doesn't pay this amount right away, the authorities will be in contact with her. And, you know, she was 10, and she was laughing, and she's like, what is this, am I in trouble? She plays it for me, and I'm like, oh my God, this is hilarious. But it's just, when you take really lonely, sad people... like, I remember watching this, um, television show once. It was some exposé on this poor man. It was just, like, this old divorcee who was being scammed by someone, and I don't even think he had, like, a voice conversation with this person, but he traveled to the U.K. or somewhere in Europe twice to meet with this person that he'd been sending all this money to.

[1068] And both times something came up and the person couldn't meet him there.

[1069] This poor old guy just kept going there thinking that the love of his life was there.

[1070] And they interviewed his daughter and she was, you know, beside herself and she couldn't talk sense into him.

[1071] And they interviewed him and he was in denial.

[1072] It was just so pathetic and sad.

[1073] And what is that going to be like now with this kind of shit?

[1074] It's going to be a lot more prevalent and it's going to get a lot better.

[1075] I mean, the rise of the ability to generate like a realistic companion avatar is going to be, I mean, it's massive.

[1076] You know, these people were complaining to me the other day about this other thing, which you're going to find this wild.

[1077] So there's this app where you can basically have a girlfriend who's an AI.

[1078] where like the AI, you'll like, like basically, you know it's a fake, like you know it's all AI, but it's like a companion chat bot.

[1079] And, you know, I get a lot of emails, like, oh, such and such is a scam.

[1080] And usually it's like some Ponzi scheme or some get rich quick scheme.

[1081] This one, they were furious because the creators had sold it like, hey, you can have hot roleplay with this AI bot.

[1082] And then the people developing the app one day said, hey, we're turning that off.

but the reaction from the community was like, you took away my girlfriend. Oh, Jesus Christ. You took away my, like... like, my partner. And these people had legitimately bonded with a bot. What is that? That's the Joaquin Phoenix movie. Yeah, yeah. What is it... Her. Her. Her, yeah. Yeah, that's really the premise of the movie, but in the movie it was all just voice. Yeah, now it's going to be some actual 3D person. Is this the one? That's it, it's Replika AI. So that's, like, still in the uncanny valley, right? You look at that and you'd have to have, like, really bad eyesight to think that's a real person. AI shuts down erotic role play, community shares suicide prevention resources over loss. Oh my goodness. People were, like, miserable. They're like, they're like, it's talking... so, and they would, like, complain, like, after an update, they'd be like... because I looked through their Reddit, I was, like, so curious.

[1084] It's like this is, this is like a new, brave new world, you know.

[1085] But they would, they would say, you know, ever since the new update, she's just not the same.

[1086] She's like talking to someone different.

[1087] And it's like, you know the back end is just like a large language model.

[1088] And they just clicked an update.

[1089] They don't care as long as they're getting this feeling, right?

[1090] You know, this is, it's really scary stuff.

[1091] Because I read this statistic recently that said that there's somewhere in the neighborhood of 30 plus percent of women are single, but it's in the neighborhood of 60 percent of men.

[1092] Really?

[1093] Yeah.

[1094] That seems really high.

[1095] I know.

[1096] It does seem really high.

[1097] It's, you know, 18 to 49 or something like that.

[1098] I don't remember the exact numbers, but it's young men.

[1099] It's a shockingly high.

[1100] It doesn't make sense, like, why there is such a disparity between the genders that men are so much more single than women.

[1101] Like, that doesn't even jibe.

[1102] Yeah, how does that?

[1103] How does that make sense?

[1104] Most men are single.

[1105] Most young women are not.

[1106] Maybe the guys are just saying they're single.

[1107] And all the girls are like, we're in a relationship.

[1108] It is just research, right?

[1109] So it's just a survey, I would imagine.

[1110] 30% of U.S. adults are neither married, living with a partner, nor engaged in a committed relationship.

[1111] Nearly half of all young adults are single.

[1112] 34% of women and a whopping 63% of men. Like, wow, how does that work? How does it work, if there's roughly 50% women and 50% men, how could 34% of women be single and 63% of men be single? It says, not surprisingly, the decline in relationships marches in stride with a decline in sex. The share of sexually active Americans stands at a 30-year low. Around 30% of young men reported in 2019 that they had no sex in the past year, compared to about 20% of young women.

[1113] Only half of single men are actively seeking relationships or even casual dates, according to Pew.

[1114] That figure is declining.

[1115] What if, like, the women thought they were in a relationship and the guys didn't?

[1116] Right.

[1117] Yeah, that's what we could say.

[1118] Yeah, you could say that.

[1119] Or maybe the women aren't being honest.

[1120] Maybe they've gone on a date with a guy and they decide that's their boyfriend.

[1121] I don't know.
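
One way to see why the quoted numbers puzzle them: with equal-sized groups, if every relationship paired one man with one woman inside the surveyed age range, the two partnered counts would have to match. A quick arithmetic check (the equal group sizes are an assumption for illustration):

```python
# Sanity check on the survey figures quoted above: 34% of young women
# single vs. 63% of young men single. Group sizes are hypothetical.

women = men = 1000  # assume equal-sized groups for illustration

partnered_women = round(women * (1 - 0.34))  # 660 women report a partner
partnered_men = round(men * (1 - 0.63))      # 370 men report a partner

# Women whose reported partner has no counterpart among surveyed men:
unaccounted = partnered_women - partnered_men
print(unaccounted)  # 290
```

The 290-per-thousand gap can't come from within-group pairing, so it has to be explained by partners outside the surveyed age range, same-sex relationships, one side not counting the relationship, or survey error, which is roughly what the speculation here is circling.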

[1122] I think the more shocking thing.

[1123] thing is just that more in general are single, less people are having sex and are engaged in meaningful long -term relationships.

[1124] I think that's, you know... there's just, increasingly, I feel like we're becoming more atomized.

[1125] Like you just kind of can get lost in your world and you get these pseudo communities popping up.

[1126] Like, if I'm a Bored Ape Yacht Club member, I could call... you know, I might say, those guys are my brothers.

[1127] But, but are they? Are they really? Like, what are, what are these new internet communities doing, right? And what are they not really replacing in the real world? Because basically that's what we've done: we've replaced a lot of physical things with online things, and sometimes that replacement works, but sometimes it doesn't. Like, I can, like, FaceTime with my mom, and it's, like, kind of the same, but it's not. It's not really. It's a little annoying. It's a little annoying, right? And they're getting better at it, but it's like, it's always kind of, like, this, like, facsimile of the real thing. And so I think this Replika AI is, like, it's sort of this, like... it's trying to treat loneliness in people, maybe. You could... that's the nice way of looking at it. Um, but it's kind of... it's pretty dystopian, man. It is dystopian. And one of the things I think accelerated it was the lockdowns, right? So especially for people that had a lot of anxiety, there were people that went a year-plus without being in contact with other people, other than their immediate family members. And so then they seek more time online, they're online more, and at the same time this AI-generated 3D image of a person is communicating with you. Just that. And then, and then the rise of, like, like, parasocial relationships, you know, they work from home. Yeah, yeah. People watch so much of online people, they think they know you, and, like... and they don't, but, but they feel like you're their friend, rather than them having, like, online relate... I was, I was hanging out with a few friends, um, and, you know, they got approached by some people, and these guys, like, felt like they knew them. Like, they're like, oh, we know you, like, I love all your stuff. That, uh... what do you think about... they're asking about one of their friends, like, what do you think about when this guy did that? And I'm thinking, like, this guy doesn't know you, right? It's... but it's a strange thing, where

[1128] It's different from like when there were celebrities.

[1129] You didn't feel like you knew Tom Cruise.

[1130] Right.

[1131] Well, that's different too because he didn't really talk.

[1132] He only talked on screen.

[1133] He's playing a character.

[1134] Yeah, and when he did talk, it was disastrous.

[1135] Like, remember when he had that interview with Matt Lauer and he was getting upset at Brooke Shields who was taking, you know, psychiatric medications and he's a Scientologist and they believe those are the devil.

[1136] And so he was telling him, you're being glib, Matt.

[1137] You're being glib.

[1138] And everybody was like, oh, my God, this guy's a psycho.

[1139] You remember those?

[1140] Ever since I watched Top Gun, I forgot.

[1141] That was disastrous to him.

[1142] But ultimately, he rode it out with Top Gun.

[1143] He's kind of proven correct in a lot of ways because it turns out that the model of why they were using these SSRIs is not correct.

[1144] Like they work, but they're not sure why they work.

[1145] And the initial thought was that they were addressing some sort of chemical imbalance in the brain.

[1146] And now that's been proven to not be correct.

[1147] How do you think we go about, so it's sort of like managing these two things, right?

[1148] Like, you manage the fact that pharmaceutical companies have profit incentives that lead them to want people to be on, you know, long-term drugs for, you know, ever.

[1149] That's the best kind of drugs, one you never get off of.

[1150] Right.

[1151] With the fact that like, on the other hand, you have a lot of like alternative health guys saying, hey, that's nonsense to listen to the guys.

[1152] They're also, a lot of them, pushing a bunch of pseudo-scientific wackiness. So it's very hard to figure out what's right and what's wrong, what's correct and what's propaganda. Yeah, because you go, like, oh, Tom has a point about all these pills. But it's like, okay, then is the answer nothing? It's hard to know. It's interesting, right? Because the question is illness, right? There are certain medications, like insulin for people that are diabetic.

[1153] These are like actual real solutions to an actual medical problem that's being created by a pharmaceutical company that addresses real issues and helps people.

[1154] And then there's also stuff like, hey, you know, maybe you need Adderall.

[1155] Maybe you need to focus.

[1156] And so they're giving you speed, right?

[1157] And so it's basically, it's not based on a disease like I can't go to a doctor.

[1158] and the doctor says, hey, you have herpes, you need herpes medication.

[1159] And then this fixes your disease.

[1160] It's, I don't feel good.

[1161] Give me something that makes me feel good.

[1162] And then they give you something that makes you feel good.

[1163] You're like, okay, I'm on medicine because I have an illness.

[1164] Look, is that really what's going on?

[1165] But what else is causing that illness?

[1166] Do you exercise?

[1167] Do you sleep right?

[1168] Are you depressed because you have no meaningful relationships?

[1169] Are you depressed because you have a job that's horrific and stressful?

[1170] Like, what is causing this that you're just putting a Band-Aid over?

[1171] And that, so there's confounding issues that are all souped in together.

[1172] And no one's the same.

[1173] That's the thing.

[1174] It's like, how much of it is environmental factors.

[1175] Like, I can speak personally.

[1176] I have developed some, like, low-grade form of ADHD, but not because I was - What does it mean?

[1177] What does it mean?

[1178] Meaning, okay, so, like, in the past, I could read books for, hours and hours on end, right?

[1179] Like, I loved reading books. But due to how much I engage with social media, that's changed.

[1180] And I'm someone who tries to monitor this stuff.

[1181] I was on a flip phone last year for like six months out of the year.

[1182] I mean, like I try to limit this stuff.

[1183] But because so much of my job is on social media and Twitter and I'm scrolling and the scroll is so addictive because it's, you context switch so much, so fast that it's like my brain when I try to lock into a book, it's like, it takes me a bit and I'm somebody who likes to read a lot I'd say I was like a voracious reader especially as a kid and like as I get older I'm having to sit down and it's more like work I like I have to intentionally like okay I got to read this book I'm not going to cut myself from distractions and I've all these apps on my phone to try to limit the amount of like screen time that I have because I'm just I know this is bad for my brain so I've given so I don't for me I'm like Adderall is not a good solution for me because my problem is not that I was born with this issue.

[1184] My problem is I'm on my device and my device is literally overstimulating my brain to when I don't have that overstimulation.

[1185] I'm just sitting in a quiet room with a book.

[1186] Now my brain's like, well, where is it?

[1187] Where's the, where's the, you know, interaction?

[1188] So for me, I think the answer is okay, for me, I just have to unplug more, right?

[1189] And that's what I try to do.

[1190] But for somebody who says, I was born like this, I can never pay attention.

[1191] Like, is the answer, you know, some people say Adderall helps them.

[1192] What do you say to those people?

[1193] So it, like, that's what I mean.

[1194] It's like, it seems like it's environmental.

[1195] Yeah.

[1196] Well, I think A, you're addicted to your phone.

[1197] For sure.

[1198] Yeah.

[1199] Most people are.

[1200] Most people are.

[1201] Yeah.

[1202] I am very fortunate that I'm not addicted to social media.

[1203] I'm addicted to watching YouTube videos, which is a totally different animal.

[1204] And I'm also addicted to watching YouTube videos on things that I enjoy, which is better.

[1205] So I've filled that gap with things like fight videos and professional pool matches.

[1206] It stimulates me in a way, but I'm not engaging with this context switching constantly like scrolling on Twitter.

[1207] I go to Twitter maybe five, ten minutes a day.

[1208] I go and I see what the fuck's going on, like, what is everybody mad at, who's in trouble. Like, I'll shit scroll. Such a funny way to describe Twitter. It's so accurate, too. But I do not post. Right. If I post, I post and ghost. I just post and I leave it alone. I don't read the comments, ever. I don't read any of my comments. I think that's great. That is very important for famous people. It's very, very important, because I have friends that don't, and they'll come to me with, you know what they're saying? I go, how do you know what they're saying? Like, what do you give a shit? I watch people ruin their lives by looking at these, like, their screens. Now, it's kind of hard, because when you first come on the scene, you get a little attention, and it's, like, intoxicating, and you want to engage, too. Yeah. Because at first, that's fun. It's like, when you have a thousand people watching you, that's beautiful. Yeah. It's like there's this community, they're resonating, and you have time to respond to people, like, intelligently. When you start to get into the millions, it's just ludicrous.

[1209] It just doesn't make sense anymore.

[1210] And it starts to be this like, your audience starts to become to you more, it feels more like a hive mind, even though it still is individuals.

[1211] It feels more like, okay, how do I get a pulse of what this actually is?

[1212] This is why people gravitate towards negative comments when they have huge audiences is because they go like, well, maybe they're right.

[1213] Maybe that like one guy represents the whole.

[1214] Of course it doesn't, but they're worried, because they don't really know what their audience thinks, because it's so many people.

[1215] So I know it's the right thing to do to unplug.

[1216] At the same time, I'm like, okay, I have to know the current events.

[1217] I have to know what's going on.

[1218] So that's one of the worst parts. I love what I do, but it is the worst part of my job, that I feel to some extent I kind of have to have my finger a bit on the pulse, to know who's into what, what's big.

[1219] But then after that, the discipline is like unplugging.

[1220] What I have found is that if something is big enough that I need to pay attention, I'll find it.

[1221] I find it through other methods.

[1222] I find it through friends.

[1223] I have so many friends, like, do you know about this?

[1224] Do you know about that?

[1225] Like, even sometimes when people are mad at me, like, did you, like, what's going on with you and that person?

[1226] I go, what are you talking about?

[1227] I literally don't know.

[1228] And then they'll tell me, I don't want to look at that.

[1229] Like, leave it alone.

[1230] Like, I don't give a fuck.

[1231] But you'll find out.

[1232] You'll find out, you'll find out because people are talking about it.

[1233] You'll find out.

[1234] Like, let the addicts scroll, let them go crazy, but for your own mental health, it's not good. And anybody who's public, like, you're a public person, you engage publicly.

[1235] You put your videos out and people comment on them and your videos get millions of views.

[1236] Like, that is not an environment where you can healthily sample people's opinions.

[1237] It's just not possible.

[1238] And human beings are designed to look for threats.

[1239] You're designed to find problems.

[1240] And so if there's one person that thinks you're a piece of shit and a hundred of them love you, that one person is the one you're going to think about.

[1241] And you're going to go, oh.

[1242] And they're confirming your worst fear.

[1243] Yes.

[1244] Your worst imposter syndrome.

[1245] They're like, you are crap.

[1246] And you go, oh, man, I knew it.

[1247] But even for people that are just regular people engaging.

[1248] Imagine people aren't talking about you because you're anonymous.

[1249] But you're engaging in this very shallow form of communication that's not natural.

[1250] You're engaging in a text -based communication with someone.

[1251] You don't know who they are.

[1252] You don't have any background on them.

[1253] You don't know if they're fucking schizophrenic.

[1254] You have no idea.

[1255] And yet you are investing your mind and your focus on these interactions that you're having with this person.

[1256] And most likely, if you're in a dispute, you're trying to win this dispute.

[1257] So you're trying to find reasons why they're.

wrong, and you're getting anxiety, and you're involved in this little sort of debate slash mental battle. It's like, fucking go outside. Go do something with your life. Like, social media is fucking dangerous. But it's not dangerous if you understand it. It's like, if you have a cabinet filled with cookies and chips, it doesn't mean you're going to get fat. You can always go into that cabinet every now and again and have a cookie, and you're going to be fine. Yeah. But if you just fucking open that cabinet every day and stuff your face, you're going to get diabetes. Yeah. And what's hard is these apps are built to be sweeter and sweeter and more fattening every year. Look at TikTok. That's the best one. And that's where I finally drew the line, where I always try to stay up to date on all the apps, you know. And I have a family member who's young, who told me about TikTok. They're like, you've got to get on this. It was back in, like, 2019, and they of course were right. I should have. No, no, no. But at the time, I just said, this is a step too far. The shortening of our attention spans. YouTube used to be short form. That's the funny thing. Then it was TikTok. Well, it started with Vine. But it's just this new idea that, hey, forget about 10 minutes, try 10 seconds for a video. And that's where I have successfully disengaged. I don't watch any TikTok or short form. Because that would be the end of my attention span.

[1259] And I feel bad for like, what do teachers do now when you're competing with like this never -ending feed of the most entertaining?

[1260] Well, the kids aren't supposed to have their phones and classes.

[1261] I have young kids.

[1262] But a lot of them sneak it and they figure out a way to juke the system.

[1263] But it's just, it's an inevitable fact of the progression of technology and technological innovation.

[1264] They're going to figure out a way to get people more engaged because it's profitable.

[1265] And there's going to be a better app than TikTok in the future.

[1266] A more addictive, more engaging app.

[1267] You have to imagine, right?

[1268] It's so funny to imagine now because you're just like, how could you?

[1269] But then we were thinking the same thing about YouTube.

[1270] You're like, wow, this is great.

[1271] Like this is so you can find anything, anywhere.

[1272] And now YouTube is like, oh, they're the, they're like the responsible, you know, like educational company.

[1273] I mean, because you can learn a lot on YouTube.

[1274] Oh, I've learned so much on YouTube.

[1275] I love it. It is kind of an incredible platform. And it is important to remember, with all these new technologies, like, there are good things. But oftentimes the people who are creating the platforms don't really tell you about the bad things. They're incentivized to play it down. Exactly. Their job is just to make something awesome. It's your job to figure out your own life. Yeah. But, you know, the problem with things like TikTok and YouTube and Twitter, and I mean, this is what we're finding out with the Twitter Files, is that then other entities get involved in the process of censoring certain information and promoting a specific narrative.

[1276] And then when you find out the government's actually involved in that, well, that gets really shady.

[1277] Like, we need some sort of regulations and or laws to stop that from happening.

[1278] Or you need someone like Elon Musk that comes along and actually fact checks the president.

[1279] You know, when they started fact checking the White House, you know, actually that's not true at all.

[1280] and that's not why there's inflation.

[1281] You didn't do that.

[1282] It's amazing.

[1283] See the White House delete tweets out of shame.

[1284] But that's the world we're living in now.

[1285] But that's not the case with YouTube.

[1286] And with YouTube, there was some real problems, especially during the pandemic, with the censorship of accurate information that didn't fit a very specific narrative that they were trying to promote because of their sponsors.

[1287] How do you regulate that, when one of the challenges, and I know this firsthand, is that the regulators are so out of touch with the technology? Because technology moves so fast that a lot of these regulators were around when it was dial-up internet, and now they're in positions of power being asked to regulate things when they checked out, you know, with email.

[1288] Yeah.

[1289] Yeah.

[1290] Yeah.

[1291] Well, you saw that when people were interviewing Mark Zuckerberg and they were talking to.

[1292] They don't know what they're talking about.

[1293] They're literally talking to him about problems with Google.

[1294] Yeah.

[1295] And he's like, hey, I'm Facebook.

[1296] And they're like, what the fuck are you talking about?

[1297] Yeah.

[1298] That is one of the bizarre things.

[1299] And so you rely weirdly enough on people to inform the politicians.

[1300] Well, who informs them?

[1301] Lobbyists.

[1302] Right.

[1303] And then you go back to people like Sam Bankman-Fried, where it's like, he's informing them with money.

[1304] Yes.

[1305] And he's like, hey, let me get a meeting with you.

[1306] So he gets a meeting.

[1307] And now he's a favorite on the hill because he seems like he's the response one in the room.

[1308] And it turns out he's a giant fraud, but no one noticed because they don't know what they're talking about.

[1309] Right.

[1310] They don't know what they're talking about.

[1311] And they're dealing with a million different issues all at once.

[1312] Does it make sense?

[1313] So I totally get the, you know, elect kind of older people because they have.

[1314] wisdom, but at the same time, does it make sense for there to be limits on age where you get more young people involved in these situations who actually know the technologies, especially on those special subcommittees where technology is such an important part?

[1315] Yes.

[1316] It makes sense to get people that understand it, and young people are going to be more likely to understand it.

[1317] But do you want people with a lack of wisdom?

[1318] Like, these are the type of people they were dealing with at Twitter.

[1319] They were dealing with young millennials that were deciding.

to censor information and to, you know, I mean, that was one of the problems that they had with issues like deadnaming people. You know, like, if someone can change their name and change their gender, and if you use their old name, like if you called Caitlyn Jenner Bruce Jenner, you'd be banned for life. Which is bizarre, because that person named Bruce Jenner won the fucking Olympics.

[1321] Like, what are we supposed to do there?

[1322] Like, you're doing this based on an ideology.

[1323] You're not doing this based on fact.

[1324] The actual fact is that person was born Bruce Jenner.

[1325] Now, to be kind and respectable to that person and refer to them in the gender that they want is nice.

[1326] That's a good thing to do.

[1327] But why is that problem something that gets you banned for life?

[1328] But you can call someone a cunt and that's fine.

[1329] I have no idea.

[1330] See, this is why I stay in my lane of scams because I'm just like, it's impossible.

[1331] Yeah, you have to.

[1332] It's impossible to.

[1333] And ultimately, like, one of the things I realize, so, so I consider myself a journalist, but one of my few privileges is that I don't have to engage in politics.

[1334] Yeah.

[1335] And it's, and it is a privilege because I see people just lose their mind.

[1336] Lose their mind.

[1337] In this culture war.

[1338] And it's like, I mean, I don't know anything about most of these issues.

[1339] And I'm like, I have.

[1340] I have expertise in like one thing.

[1341] And I do have an expertise in it. But I think now, if you're a journalist and you politically align yourself, you're expected to have a position on everything.

[1342] Yes.

[1343] Even if you have no idea what you're talking about, you're expected to take whatever the party line is.

[1344] You're expected to take it.

[1345] Even if you haven't considered it.

[1346] That's what happens.

[1347] And I've seen sort of people in the media become like co -opted.

[1348] by their audience where they may have to have these opinions.

[1349] Yes.

[1350] And so I feel lucky, because I feel like there is no, like, mainstream thought in, like, scams.

[1351] I'm just like, let me interview a few victims and like they'll tell the story and that's great.

[1352] And I kind of stay away from that.

[1353] So I mean, for me, that's where that's where I always go back to.

[1354] I'm just like, I don't know.

[1355] That's a very good position because I've fallen into that.

[1356] I haven't fallen into audience capture, but I have fallen into the ideological game where if you're in one camp, you're supposed to have all the opinions that one camp has.

[1357] And if you do not align with all the opinions that one camp has, you find yourself cast out of the group.

[1358] And I thought initially, wrongly, that what the internet was going to do was provide people with so much data and so much information that we would lose camps and that people would instead have a more.

more open-minded and centrist view of things and say, well, I could understand why people would think this because of that.

[1360] And I could understand why.

[1361] And we would have like more of a collective idea.

[1362] But what I didn't anticipate was social media and the echo chambers that it would provide.

[1363] Right.

[1364] And that these ideological echo chambers also come with virtue signaling. People get on these things because you're only dealing with a short amount of characters, and you state something that you know is going to get a bunch of likes, and people are very addicted to likes. And there was some talk about, like, removing likes, because they realized that likes were an issue. And then people freaked out, just like those people freaked out about taking away your fucking chatbot girlfriend, and they stopped doing that. They stopped that idea. But if you didn't know whether or not people agree with you or disagree with you, I think that would probably be better overall for people. Because whether or not people agree with you or disagree with you is important, but you don't know those people.

[1365] It's important if you know the people and you respect them and appreciate them.

[1366] And that used to be the world.

[1367] Yeah.

[1368] The world used to be, you know, I go to Coffee Zilla and I go, hey, man, what do you think about this Ukraine thing?

[1369] And then I know you and I know like that you're honest.

[1370] And so I talk to you and you say, well, this is what I've read.

[1371] Right.

[1372] And this is what I think.

[1373] And then I go, oh, that's interesting because I thought this.

[1374] And you go, yeah, I thought that too.

[1375] But then I found out that. And you go, okay, and you get sort of a more informed, neutral position on what things are. I don't think people are getting that. There was a funny meme that came out right when the war started, that was like the instantaneous change of people going from being healthcare experts to foreign policy experts. Oh, it's hilarious. It's very funny, because that's what people do. They find out, what is the new thing that I can say that's going to get me likes? So let me throw that Ukraine flag up in my Twitter bio alongside my gender pronouns and get after it.

[1376] And let's get some likes.

[1377] And now everyone's AI experts too.

[1378] Now they used to be crypto experts.

[1379] And now it's like everyone's AI expert.

[1380] Yeah, it's the classic.

[1381] It's like everyone's always current affair experts.

[1382] It's a weird thing how social media like it's an echo chamber, but it's a weird kind of echo chamber because it's not just what you think.

[1383] So if that were the case, that would kind of be.

[1384] obvious.

[1385] But it's also like you're shown the other side, but the most incendiary, insane side of the other side's views.

[1386] Almost to the point it's like caricatures.

[1387] So let's say you're a right-winger. You know, like, okay, the most insane people on the left are going to get the most likes for me, because my camp's going to love it.

[1388] They're going to eat it up because they're going to look as insane as possible.

[1389] So you make them look insane.

[1390] The left wing people, they go, okay, let's select for the most insane right wing person and we'll put him out.

[1391] out there.

[1392] And so they both put out like these like sort of extreme views of the other side to their audience.

[1393] And then if you're in that echo chamber, you go, like, wow, those guys are literally insane.

[1394] I mean, because you think that's what the other team is just like agreeing with like, yeah, this is normal.

[1395] And meanwhile, the other team would be like, yeah, that's a little crazy.

[1396] But like, we actually think this.

[1397] We have a more moderate position on whatever.

[1398] So, you know, what I usually find is, when you actually deal with individuals instead of labels and ideologies, people are pretty normal. But, you know, a lot of people have been caught up in this battle.

[1399] And it's like a reaction to the reaction of the reaction.

[1400] Yep.

[1401] Where, you know, you go from like, okay, it was the mainstream media.

[1402] Then it was, like, independent media.

[1403] And, you know, I'm in independent media, and so I understand the temptation as much as anybody to dunk on mainstream media, because it's easy, right? And they are wrong so often. But then the mainstream media gets pissed off, and they're like, hey, look, you independent media, all you do is spend your time complaining about us. What are you actually doing in terms of news gathering? Are you on the ground? What are you doing? Sometimes they are. But, you know, I think the news, it's all just kind of decentralizing into a lot of different camps.

[1404] And there's good people everywhere and there's bad people everywhere.

[1405] There's great journalists, you know, who are trying to make a difference in bureaucracies at MSNBC or wherever.

[1406] There's great people, there's great regulators trying to make a difference.

[1407] But everyone's dealing with their own incentive problems and their own challenges with bias and their own echo chambers, so they make mistakes.

[1408] And then when they make mistakes, the other team, just goes like ah yeah there's that and then there's also financial incentives yeah it's financial incentives that yeah I mean when you get motivated by whoever is your sponsor whoever is the advertising revenue provider for whatever show you have and that that becomes a gigantic issue when you see a mandate that gets pushed through and when you see people clearly moving in lockstep all together like a coordinated effort to discredit someone or to go after some topic or to to give a very biased and distorted version of something that clearly benefits the advertisers, it gets very sketchy.

[1409] And for the mainstream people to say, like, what do you do?

[1410] All you do is criticize us.

[1411] Well, that's a very valuable role, guys.

[1412] Like, that's a very valuable role because you people are fucked.

[1413] Like, you're not Walter Cronkite.

[1414] This is not the New York Times of 1970.

[1415] This is a completely different animal.

And it's an ideologically captured animal. And then you have mainstream television, which is almost bullshit. It's almost like you could just say CNN is bullshit, Fox News is bullshit. How much of it is bullshit? Is it 30% bullshit? Well, if I gave you a sandwich, and it was a cheeseburger, but it was 30% dog shit, am I allowed to call that a cheeseburger? Now you have a dog-shit-infected cheeseburger. Right. And that's what a lot of television news is.

[1417] And it's not news in the sense that they need to get you informed, like a service that they're providing because most people don't have the time to gather that information.

[1418] It's a propaganda disseminating entity that relies on advertising.

[1419] The advertising shapes the propaganda that gets disseminated.

[1420] That's fucking dangerous.

[1421] And so if independent media doesn't exist, where someone who is not captured by that can point that out, we've got a real problem with information. Because then it's going to be who has the most money and who can buy out the most media. And there's a lot of that going on, and that's scary. It's scary for people that don't know the truth. And it feels horrible when you get duped, when you think that a mainstream story is correct, and then you find out, oh my god, I got fucked. Yeah. Well, I mean, what I think is, the problem with pointing out financial incentives is, everyone has financial incentives.

[1422] Everyone, even independent media has to make a buck somehow, right?

[1423] But what I'll say is I've been on some of these like mainstream shows, not many of them, but you know, a few of them have invited me on.

[1424] And what I've noticed is they're just bad, like the, the platform itself is just a bad way to express yourself.

[1425] Absolutely.

[1426] Because, you go, I went on one, and I won't name it, but, like, you're in this waiting room and they join you. It's like this Zoom version of it.

[1427] And they go, hey, how's it going?

[1428] I'm good, I'm good.

[1429] Okay, we're on in five.

[1430] And then they ask you this, like, three-second question, and they cut you off after you give this, like, sound bite.

[1431] And you're aware like, okay, this is, it's live, but it's actually not live.

[1432] Like they're going to release it later.

[1433] So I'm like, why can't I really think about my answer?

[1434] Right.

[1435] But it's given with this perspective like, okay, you have, you know, you have this 30 second answer.

[1436] And then they respond.

[1437] And then before you can even respond, they cut to a new segment.

[1438] And so I'm like, you can't even get in.

into the meat or nuance of the argument. The format literally constrains your ability to tell the truth, the whole truth.

[1440] And so one of the things that I think has been so unlocking about YouTube is, like, I just released a story, and it was about a 30-minute story.

[1441] So you know how long it was?

[1442] It was 30 minutes.

[1443] When I have a 10-minute story, it's a 10-minute story.

[1444] When I have a 50-minute story, it's a 50-minute story. That is such an underrated format shift, to where you are able to tell the truth in the size that it is.

[1445] Yes.

[1446] And I think that's the problem now with mainstream media. The challenge is, they're stuck in an old format.

[1447] Yeah, and it's unfixable because they're connected to advertising.

[1448] So they have to go to commercial every X amount of minutes.

[1449] And that's not going to change.

[1450] Yeah, and you need the in and you need the out.

[1451] And they also have a time, they have a time spot.

[1452] So their time slot is, you know, 8 p.m. to 9 p.m. That's it.

[1453] So there's many subjects that are deeply nuanced, and you can't cover them in 60 minutes.

[1454] And you don't get 60 minutes anyway.

[1455] You get 44 with commercials, or maybe even less, depending on the show.

[1456] That, you're fucked.

[1457] You're fucked.

[1458] Because, like, it must be incredibly frustrating for someone who exists in mainstream media to see a person like you go into a deep dive, and then they'll look at the video and like, this motherfucker got 3 million views.

[1459] Like, this is crazy.

[1460] You know, my stupid fucking show on whatever network gets, if you're lucky, a couple of hundred thousand. And in the key demographic, what is it, like 40, 50 thousand? And these are, like, big shows. And that's hilarious. But also, it's great for you. It's great for me. And it also shows that people have this perception that because short-attention-span formats like TikTok work, they're very effective, that that's the only thing people want to consume. That's not true. It's not true. I think it's actually kind of splitting into two things, where you have, like, hey, I have a break, I'm going to watch something short. Or, hey, I'm going to go do something.

[1461] Let me put on a show.

[1462] Let me like, let me learn while I'm, that's become hugely popular.

[1463] It's like, hey, I'm setting up something in my office.

[1464] Let me turn something on and learn something.

[1465] While I'm doing it.

[1466] While you're cleaning your office, you're actually absorbing something.

[1467] Exactly.

[1468] Exactly.

[1469] I'm like sitting with, basically sitting in the room with an expert as he describes some topic that I'm interested in. But then there's a problem: what if that guy's full of shit?

[1470] What if that guy's full of shit?

[1471] And there's no fact checkers.

[1472] So there's no one checking.

[1473] And who fact-checks the fact-checkers?

[1474] Right.

[1475] I mean, it's problems all the way down.

[1476] But I think like the thing that I worry about the most is that, you know, we have to have some commonality.

[1477] And so, you know, I think why I like spending time on things that unite people is like, I'm like, all right, my show, you can agree

[1478] with no matter what.

[1479] Like, or you can watch it and you can disagree with it, but like it doesn't, you're not divided.

[1480] Yeah, you're not divided by, you know, your interests either way.

[1481] And so I think it's such a ripe moment for journalists to do more than play the game of battles.

[1482] But I don't think they can in mainstream media.

[1483] That's why it's so interesting.

[1484] And that's why independent media has a huge advantage.

[1485] Do you think, do you think like, don't you think like 60 minutes has done a pretty good job?

[1486] They only have 60 minutes.

[1487] Yeah, yeah, yeah.

[1488] And they don't even have 60 minutes?

[1489] But don't they do like real stories, not just like, like, partisan, you know, whatever.

[1490] They'll do a little bit of the politics, but they actually like, they'll, and Vice was doing that for a long time.

[1491] They did these incredible, like, documentaries.

[1492] Like, that's journalism at its best where you're just like, you're just deep diving a topic that just people find interesting.

[1493] You go somewhere, you see, you talk to people, and you present the facts, but you don't go in with this, like, preconceived thing of, okay, I know what happened, let me tell everyone. Like, I'm just going to, you know.

[1494] But Vice is a good example of what's wrong, because they were that, and then they got bought, and the people who bought it were like, yeah, you got a great thing going on, but we're going to fuck that up and we're going to turn it into this woke fucking platform, this weirdo platform.

[1495] And that's what it is now.

[1496] It's like you can kind of guess what their angle is going to be before they even write the story.

[1497] I still, I will say there have been a few good Vice pieces.

[1498] Oh, for sure.

[1499] But I know, I know what you mean.

[1500] Like, it's like, it's really challenging.

[1501] I really try to be as charitable as I can because I know, like, a lot of these journalists are working within these horrible constraints of like, you know, they want to do investigative work.

[1502] You know, one of the dirty secrets of the journalism game is that...

[1503] Investigative journalism is the loss leader for every single news agency.

[1504] They're all losing money on investigative journalism and they want to do as little of it as possible for the bottom line.

[1505] Because it's expensive, man, to go send out a guy to really do the work.

[1506] You know what's easy?

[1507] Putting on a commentator and, you know, I can just pull up a bunch of articles all day and I'll just talk my talking points about those articles.

[1508] Like that's the profitable side of things because it's quick.

[1509] You can churn out clips, and at the end of the day you can use the findings of investigative journalists. You just put them on your show and you go, hey man, I heard you found this. And they spent like three months on it, and you spend like 20 minutes, and you get double the views, because people know you, because you're on TV all the time or you're on the internet all the time. And so that's one of the real challenges. I know journalists want to do that investigative work, but they have editors. Yes. And they have people telling them, hey, we judge you by the number of clicks you get on our site.

[1510] We're a click site or we're a subscription site.

[1511] So you've got to cater to the kinds of people who subscribe to us.

[1512] If we're the New York Times, we have a certain type of people who subscribe to us.

[1513] If we're, I don't know what the equivalent is, the New York Post or whatever.

[1514] We have a type of person.

[1515] So I think the New York Post is more, New York Post is more advertising.

[1516] But the point is that these journalists are kind of like sent out on these mandates.

[1517] Rather than go find the truth, that's what they want to do.

[1518] But instead, you know, you're battling for attention in a click world where you're not even controlling your traffic.

[1519] The social media company is controlling your traffic.

[1520] And it's about how many likes you get on Twitter and how many retweets you get on Twitter.

[1521] Yeah.

[1522] Yeah.

[1523] They're trapped.

[1524] And it's not good.

[1525] But the thing about independent journalists is that it's like they're not going to send someone to Turkey to investigate something.

[1526] They don't have the money.

[1527] You know, they also don't have people that they can just send out.

[1528] And that was one of the cool things about Vice is they did do that.

[1529] And back in the day, they would send someone to the front lines of some foreign war.

[1530] And, you know, you see some fucking journalist with glasses on with a flak jacket on.

[1531] Isn't that crazy?

[1532] It was wild.

[1533] Vice was wild in the beginning, you know?

[1534] And I'm good friends with Shane Smith, and I was friends with him in the early days when all that was going down.

[1535] It's fascinating to see what they did.

[1536] But he sold it.

[1537] Yeah, yeah.

[1538] Well, here's the thing too.

[1539] Independent media journalists, after a certain size they can do it.

[1540] The problem is they realize like for clicks, it's like, hey, I should just stay in my room and make 20 videos instead of going out.

[1541] And so that's why I'm a big believer in like subscription models for independent media journalists.

[1542] Like Substack.

[1543] To get away.

[1544] Yeah, yeah, I do like the YouTube equivalent of like Patreon.

[1545] And it's like, that is a way for me to free myself from the view model, which I did for a long time, where it was just about views. And so eventually I was like, man, I really want to deep dive something, and I don't want to be limited to, like, do I think this is a popular thing? So that was a big change. And I think, yeah, things like Substack, it really frees up people. And I think as we learn to pay for journalism, I think that's a big thing, because it's not free. We got the false impression that it was free from years of just being able to go on, like, Google News or whatever and sorting through. Meanwhile, the quality of journalism was just dropping like a rock as everyone moved to this ad model. Yeah. Yeah, to digital. It's just, there's no money in it. I mean, and the money is in just the mass production of slop. Yeah. Yeah, I don't envy them. It's not good. But it is great for someone like you. Yeah, it's great for us. Yeah, it really is. Yeah, I mean, especially for having long-form conversations.

[1546] What I found is that any time a model breaks, it gives you the chance to restart.

[1547] So you just described the kind of the problems with the models of mainstream journalism that allowed for an opening, because people are thirsty for like real conversations.

[1548] Yeah.

[1549] And so this podcast can go on as long as it goes on for, and we can clarify anything, we can do.

[1550] But this, if there wasn't problems in the previous generation, there might not have been that opportunity for you to, you know, get big, basically doing what you do.

[1551] Well, this thing didn't exist before.

[1552] The only form that you had that's similar to this was radio, you know, morning radio. I mean, this is literally where I came up with the idea to do this, was being on the Opie and Anthony show.

[1553] Really?

[1554] Yeah, and being on the Howard Stern show where you would go, wow, I'd like to have one of these things.

[1555] We just have fun with people and sit around and shoot the shit.

[1556] It would be great.

[1557] But no one was going to give me a show like that.

[1558] And they certainly weren't going to give me a show where one day I'm going to interview a UFO expert.

[1559] The next day it's a psychologist.

[1560] The next day it's an athlete.

[1561] And it's just whoever I'm interested in.

[1562] And I wouldn't, no one would say, yeah, interview whoever you're interested in.

[1563] Here's some money.

[1564] You'd have to create it on your own, which is what I did.

[1565] But I didn't do it for profit.

[1566] I did it because I thought it'd be fun to do.

[1567] That's literally how I started doing it.

[1568] and then it became this thing.

[1569] But I kept it the way I started it, where I'm only, like, I got interested in you by watching your videos.

[1570] I got interested.

[1571] I'm like, oh, this is fascinating.

[1572] Oh, this guy's clarifying this stuff.

[1573] I was wondering why.

[1574] Oh, okay.

[1575] And then here we are talking.

[1576] Like, it's that simple.

[1577] And I reached out to you.

[1578] It's like me to you, and then we're here.

[1579] There's no other people.

[1580] It's crazy to think of, you know, I kind of grew up in this.

[1581] I'm only 28.

[1582] So I kind of grew up in it as everything was shifting underneath people's feet, and it's interesting to watch. Like, I am very fortunate to have never had to deal with these middlemen and these people. And people have tried to inject it, but I got enough people who had been burned by that telling me, like, hey, you don't want to sell the show, you don't want a middleman, you don't want this guy saying he can get all these deals. You do not want this guy. He's just going to use you, and he's going to inject himself for nothing.

[1583] You get nothing out.

[1584] And then your show becomes worse.

[1585] It becomes this different thing.

[1586] Yep.

[1587] But I was so fascinated. Like, that is my favorite, and it's legitimately the most exciting part of independent media: for the first time, there's no business people telling people what to do.

[1588] There's no top line guy who's saying, hey, we'd really prefer it if you sold more ad spots or did more of this.

[1589] It's just you and the audience.

[1590] And that direct connection is special, and we've never really gotten to see it before.

[1591] And, yeah, I think that's a game changer.

[1592] Yeah, I know a lot of people that have podcasts that sold like half of their podcast, or they, you know, got into some sort of a deal with a management company, and the management company takes a percentage of the show.

[1593] And then all of a sudden, other people are on conference calls dictating guests and telling you to avoid certain subjects or don't have this person on or don't talk about that.

[1594] Or every time you talk about this, we, you know, if you get a, you know, a strike against you on YouTube, it's going to cost us.

[1595] Yeah, that's not, I mean, then you're back in the same trap that you were trying to avoid if you were trying to avoid that trap in the first place.

[1596] But a lot of people were not trying to avoid that trap.

[1597] They just started a thing.

[1598] And then along the way that thing became profitable and people recognized it was profitable.

[1599] And then they swooped in and tried to buy it.

[1600] And it's very tempting.

[1601] Someone comes along.

[1602] Hey, Coffeezilla, we've got X amount of money for you. Oh yeah. And then you don't have to worry about money anymore. Oh, don't you want to do that? And then you're like, okay, okay. Well, here, now you have to interview this person. It's all to promote some crappy crypto coin or something. Like, that's what it always is. Imagine if that was you, if you fell into that. Yeah, no, there's, I mean, there's been plenty of that. People are always asking, like, hey, will you promote this, will you do that? And it's just like, why sell out like that? I think people just want to be free of worrying about the future.

[1603] You know, if something comes along and now all of a sudden you don't have to worry, like they're going to throw X amount of dollars at you and now you're owned by this corporation so you don't have to think about who your guests are.

[1604] There's always catches, though.

[1605] Oh, for sure.

[1606] I think, actually, though, the kind of day-to-day struggle of, like, you got to make something, I got to generate something useful, is actually kind of good, because it kind of makes you strive.

[1607] It kind of makes you push.

[1608] I really like, you know, I feel like I'm literally living a dream because I started making these YouTube videos.

[1609] Now I've got this like crazy set and, you know, I'm able to like learn all about cinematography and somehow I get paid for it.

[1610] And it's kind of this wild thing, but at no point did I have to ask for anyone's permission.

[1611] Yes.

[1612] Like, that is the thing: nobody had to give me a chance.

[1613] Like, you kind of create your own chance in a weird way.

[1614] That's the beauty of YouTube.

[1615] You know, and I had a conversation with Russell Brand about this.

[1616] And I'm like, here's this guy who is a movie star.

[1617] Yeah.

[1618] This huge movie star decides, you know what?

[1619] I'm just going to have a camera pointing at me, and I'm going to rant and rave and have these comedic takes on social issues and issues in the news.

[1620] And it's become massively popular.

[1621] And I'm like, one of the things that's interesting is, like, you're doing it the way anyone can do

[1622] it.

[1623] Like, anyone can set up an iPhone and have it pointed at you, and you just start talking and then make a video.

[1624] And there's a lot of them out there.

[1625] Like, you're not doing anything different.

[1626] Right.

[1627] You know, what's fascinating is, you know, what was the biggest tell?

[1628] Like, when I felt like everything broke was when all the late night people had to go home for COVID.

[1629] Wasn't that crazy?

[1630] You got to see, like, they went from, you know, they're TV people.

[1631] And all of a sudden, what's on TV looks like a YouTube video.

[1632] And you go, oh my gosh.

[1633] You suck at this.

[1634] This whole time, like, I thought you guys were something.

[1635] Like, you're the same as me. Yes.

[1636] But like worse.

[1637] Way worse.

[1638] I'm doing it myself.

[1639] You have this whole team of people.

[1640] And it's like, this is all you can do.

[1641] And then you, it kind of breaks the illusion.

[1642] Like, it's like seeing someone run a four minute mile or whatever.

[1643] You're like, oh, I can do this show.

[1644] Like, I got this idea to make this crazy set because I saw somebody do this, like, TED Talk, and, you know, I think their line was like, you

[1645] can't become Kanye in your living room.

[1646] Like, you got to make an environment that, like, speaks to what the show is.

[1647] It's kind of a weird thing now.

[1648] But some people do do it and are very popular.

[1649] They have a very popular show, and they do just do it from their living room.

[1650] And that's a different appeal, because that's, like, that's raw.

[1651] So there's an appeal to the raw, and then there's also an appeal to, like, you know, high production value.

[1652] And it's different things.

[1653] They both communicate a different, like, kind of appeal to the work.

[1654] But I was always obsessed with, like, there is no difference between YouTube and Hollywood besides just a little bit of knowledge, a little bit of insider stuff. Like, they kind of know tricks, there's tricks of the trade, they kind of have a little bit more money. But I was like, you can hack this together now. Yeah, you can figure out ways to almost get to, like, a Netflix level. So that's my dream, is to start pushing for real documentaries, or mini documentaries, on YouTube that look like they could be on a Netflix or something like that.

[1655] But never go to Netflix.

[1656] Yeah.

[1657] Like never take the deal.

[1658] Like never go to the producer.

[1659] Just always be doing it yourself.

[1660] Yeah.

[1661] I think what you're saying about the late night things is so true because I remember watching them do monologues with no audience.

[1662] And I was like, who said okay to this?

[1663] Why are you doing this?

[1664] There's not a fucking chance in hell that this is funny or going to work. And when you see those flat, corny late night monologue jokes with no audience, those are so fucking cringy. And you're also dealing with a lot of these people that are not stand-up comics, so they don't even really truly understand how to deliver it right. Like, they don't have the chops. What they're doing is just reading off of a teleprompter. So a bunch of really good joke writers wrote them some stuff, and then they're playing to the audience, and the audience is laughing, so they get this feedback, and they know how to do that. When it's just them and the camera, you're in the void now. You're in deep space. There's no one around you, and it's fucking wild to watch. It is wild. Can you unlock, like, how different is that from stand-up? I'm just a casual viewer of the late nights. I mean, I know they say, like, applause, but is that real laughter? And are they, like, saying, hey, clap? Oh, there's someone who's doing this. There's someone in front of the crowd. There's a warm-up guy, and generally the warm-up guy's a failed comedian or a middling comedian who's just trying to make it, and they're doing the warm-up thing as, like, a side gig. And there's people that are good at warm-up. And the problem with being good at warm-up is it's a profitable job, and it'll actually keep you from being good at stand-up. Oh, you're like stuck doing... Yeah. I've had friends that were stuck doing warm-up.

[1665] And then some of them quit and some of them didn't.

[1666] And the ones that didn't are fucked.

[1667] And because those shows aren't, they don't even exist anymore.

[1668] There's only a handful of those shows.

[1669] So like if you see, like, how many talk shows are there?

[1670] There's very few.

[1671] Yeah.

[1672] There's Colbert.

[1673] There's Jimmy Kimmel.

[1674] Fallon.

[1675] Fallon.

[1676] There's a few.

[1677] There's only a few.

[1678] Yeah.

[1679] And so, you know, they, they stand there and there's applause signs.

[1680] And then there's producers.

[1681] There's the one

[1682] warm-up guy that's literally telling people the clock. They're like, okay, explaining to the people: okay, when Jimmy comes out, I want a big round of applause. Let's practice this right now. Ladies and gentlemen, Jimmy Fallon! Yeah. And everybody, they practice it. Yes, yes, yes. They'll train the audience how to do it, depending upon the set. But I've seen them do that at different places. And I had a friend who was a writer in the early days of Conan, he's a buddy of mine who was a comic, and I went to see one of the very, very early Conans.

[1683] So this is like, I guess it was like the 90s or the early 2000s?

[1684] No, it had to be the 90s.

[1685] It was the 90s.

[1686] And they were reading their banter between Conan and who's the other guy, Andy Richter?

[1687] Oh, Andy Richter, yeah.

[1688] They were reading off of cue cards.

[1689] So they had a giant cue card.

[1690] The banter was fake.

[1691] So the banter, their dialogue back and forth was scripted.

[1692] So they were saying, so Andy, you know, I understand he got married.

[1693] And so they're reading it.

[1694] And I'm watching the cards.

[1695] Like, this is madness.

[1696] Who approved this?

[1697] And it was terrible.

[1698] The early days of Conan, like, that sort of banter was fucking...

[1699] The thing about Conan is like he's this funny guy.

[1700] He was a funny writer.

[1701] He's a really smart guy.

[1702] And he had to figure out how to do the talk shows.

[1703] Yeah, he figured it out.

[1704] He figured it out.

[1705] But in the beginning, it was awful.

[1706] And I watched like the audience is being cheered on.

[1707] There's literal applause signs that flash to tell you when to applaud.

[1708] And they're like, we'll be right back.

[1709] Yay.

[1710] And everybody claps.

[1711] Is everybody really clapping that you're going to be right back?

[1712] Nobody gives a fuck if you're going to be right back.

[1713] I feel like they didn't know that, though.

[1714] Like, I feel like media literacy has kind of gone through the roof.

[1715] Like so many people, I guess it's maybe everyone has cameras now.

[1716] So everyone's like sort of mini producers now of their own show.

[1717] Sure.

[1718] And so they get it now.

[1719] And so all of a sudden the craving for authenticity gets so much higher, because now you're aware of what a teleprompter is.

[1720] Everybody sort of, like, even though I didn't know how exactly it worked, I kind of was vaguely aware that, like, they kind of told you to applaud.

[1721] And like, and the laugh tracks in, you know, sitcoms were just canned laughter.

[1722] So I feel like as people realize the fakery, there's a craving for like, hey, can you do this for real?

[1723] Like, can you not, you know, I remember when I found out like, none of the conversations were real.

[1724] I was like, what?

[1725] Right.

[1726] What do you mean that's not real?

[1727] Like, it's all pretend. They're all pretending that you really knew about my funny boat story, and, like, I had this quippy, you know. And I thought, wow, they're so charismatic. And then you find out they've been rehearsing the story. Yes. They go over it with a producer on the phone. And it's completely insane when you realize that. You go, like, oh, it's all fake, and the illusion is sort of gone. And so now I think one of the surprising, but also maybe obvious in hindsight, things was why shows with no laugh track, less produced, are more engaging. It's because there's more of a realization of, oh, there isn't, like, games here. There's just two people talking. They haven't rehearsed their lines. And I mean, I came on here, like, there was no production notes. There was no, like, hey, we want to talk about this. It was just like, hey, you want to come on? And that's all it is. And so I think, I think... We didn't even discuss what we're going to talk about. No. Which is what I do with everybody. I just have them come in and talk. Yeah. Well, it's fascinating, and I think that's partly to do with why people enjoy the show, is that they know it's not tricks and gimmicks. I wonder, it's funny to me as I'm thinking about it, I'm like, the world is accelerating in two directions: towards authenticity, and then, with all the beauty filters and the fake AI voices, it's like you can fake reality, but we also crave reality at the same time. Yeah, for sure. Yeah. People are craving real human experiences.

[1728] And if you watch those late night shows, you never feel like you know that person.

[1729] You never feel like you're there.

[1730] But if you're just talking and you and I are just talking, someone is like on their iPhone or whatever they're doing, they're a fly on the wall.

[1731] Yeah.

[1732] They're here in a weird way.

[1733] I always thought that like live streaming your whole life would become big.

[1734] Well, that was the Truman Show, right?

[1735] Yeah, no, but I thought we would see it.

[1736] Like, I guess you see it with Twitch streamers who, like, stream like 12 hours a day.

[1737] But I kind of thought what would take off as a, I was kind of surprised it didn't, was like, you just watch my whole life.

[1738] Like, like, some people did do that for a while, right?

[1739] Yeah, I was kind of surprised that didn't, because I thought eventually you'd have celebrities whose whole life would be on display, and, like, the authenticity of just sitting in a room with somebody. It just, it's quiet.

[1740] I think people just got too weirded out by that.

[1741] But wasn't there a, there was a movie.

[1742] I forget who was in the movie.

[1743] But there was a movie where someone had their whole life filmed, and at the end, they rejected it and decided.

[1744] That's Truman Show for sure.

[1745] But Truman Show was like, that was the Jim Carrey movie, right?

[1746] Yeah, yeah.

[1747] But that was fake, right?

[1748] Like, he didn't know.

[1749] He didn't know they were filming his whole life and he rejects it.

[1750] There was another one where the person became famous because they followed them around with cameras everywhere.

[1751] And at the end of it, like, he fell in love with the girl or something, and it was over.

[1752] You know, there's always some corny fucking reason why he cancels it.

[1753] Do you remember, you know what I'm talking about, Jamie?

[1754] Yeah, 100%.

[1755] I'm trying to figure it out.

[1756] I thought, for some reason, Matthew McConaughey was in it, but I don't think that was it.

[1757] Maybe it was Ethan Hawke or someone, like some famous person, but they...

[1758] Like something TV.

[1759] Yeah, Ed TV?

[1760] Yeah, there you go.

[1761] That's it.

[1762] Who was it?

[1763] Who was Ed TV?

[1764] But it was that was kind of the premise of the film.

[1765] It was Matthew McConaughey.

[1766] It was Matthew McConaughey.

[1767] Yeah, 1999.

[1768] 99.

[1769] So in that movie, he, like, gives up on everything after a while, right?

[1770] Oh, wow.

[1771] Yeah.

[1772] See, the can't.

[1773] That was it.

[1774] You're live on Ed TV.

[1775] So that was him, just a regular guy who became famous, living his regular life.

[1776] How crazy is it that?

[1777] That was 99.

[1778] It is crazy.

[1779] And they kind of predicted it. I mean, that was Justin.tv for a while.

[1780] How does she, she's still so hot?

[1781] How is she doing that?

[1782] The fuck is she taking?

[1783] She pulled it off.

[1784] But it's, that was the

[1785] thing, is like, this would be bad. And they were sort of saying, no one wants this. Like, imagine if you got famous this way. What a disaster. Meanwhile, then you have social media influencers who are, you know, every single aspect of their life they're live streaming, they're putting it on camera. That's what Justin.tv started as. Yeah, eight years later, though. He literally attached a webcam, yeah, that's right, to his baseball cap. I've talked about it, it was really interesting. The first time we live streamed, we live streamed on Justin.tv in the green room of comedy clubs. So what we'd do is, like, my buddy Redban, Brian Redban, we would go on the road together, and we just thought it'd be funny to live stream while we were there in the green room. Yeah. And so we just did that just for fucking around, and it was just totally like, yeah, there, that's us in the green room. It's still there. That's hilarious. Oh my, yeah, I think that's the Hollywood Improv, right? Is that one of them? The other one was Pasadena. That's back in my full beard days. So we did that before the podcast itself, just for fun. And so there was, like, all these different versions of it that I tried out, where I was thinking, like, there's got to be a way to do something where I don't have to go to someone and say, hey, can you give me a show? And then when I saw Tom Green's show... There were two things that gave me the big idea. One of them was Anthony Cumia from Opie and Anthony. He did this thing called Live from the Compound, where he had his house set up with a green room in his basement. And Anthony's a psycho, so he was singing karaoke while holding a machine gun. It was so crazy, because he had all this money, right? He's very wealthy, so he had, like, a full production set. Like, he built a set in his basement.

[1786] And I was like, this is wild.

[1787] He can just do it.

[1788] But he was already on the Opie and Anthony show.

[1789] And so he decided for fun with his friends.

[1790] Like he had, you know, a fucking, like a full bar down there with like Guinness on tap and they were just drinking and being ridiculous and he was doing a talk show and just having fun.

[1791] Just being silly with his friends.

[1792] And I was like, I could do that.

[1793] And so we started doing something like that with a laptop.

[1794] And when I went to Tom Green's house, Tom Green had turned his home into a television studio, and it was on the internet. And this was 2007, somewhere around then. And so he had these fucking cables running through his living room, and then he had a server room and everything like that. And he takes me on this tour, like, this is wild. And there's a video of me sitting next to Tom Green, because he had it set up just like a regular talk show, where he had a desk like Johnny Carson, and he was sitting there, and he had screens. And this is me explaining why I think this is going to be the future.

[1795] For sure, they'd be assholes.

[1796] There's no super cool hecklers.

[1797] They don't exist.

[1798] No, this is not about that, but this is, that's me, but there is one video of me, like, figuring it out.

[1799] Yeah, that's like a live stream show.

[1800] Yes, that's it.

[1801] So this is like, awesome.

[1802] Thank you, man. This is the craziest thing ever.

[1803] It really is different, you know, than television or anything.

[1804] This is way better.

[1805] It's like radio, but it's like television.

[1806] And the genre is different because we can sit here and ramble.

[1807] You know, there isn't that time constraint.

[1808] You know, there isn't that pressure.

[1809] I mean, you know, we want to keep it moving.

[1810] Well, not only that.

[1811] There's not a corporate pressure.

[1812] You can't just express yourself because you're expressing yourself to someone who's selling advertising space.

[1813] That's crazy.

[1814] You just need to keep doing this.

[1815] You called this out.

[1816] We need to figure out how you make money from this.

[1817] Yeah.

[1818] I've got a lot of neat ideas I want to talk to you about, because I know you're into this computer thing.

[1819] Take a little bit from the big wigs, right?

[1820] Dude, this is, I mean.

[1821] They don't need to exist.

[1822] They're non-creative people.

[1823] We talked about this before the show.

[1824] They're non-creative people who are controlling creative things.

[1825] And they want to have their input.

[1826] Just abandon them.

[1827] Abandon ship.

[1828] Isn't that crazy?

[1829] Wow.

[1830] That...

[1831] Called it.

[1832] Yeah, you kind of did.

[1833] You know, I just realized why no one live streams their whole life.

[1834] I just realized it.

[1835] I remember people were trying.

[1836] And, like, they would go out and they'd call it IRL live streaming.

[1837] And you go to, like, the store.

[1838] You know what the problem was?

[1839] What?

[1840] People would swat you.

[1841] Oh.

[1842] So they'd, like, call in a bomb threat or something.

[1843] Oh, God.

[1844] So the problem is you get enough people watching live.

[1845] One of them is a psychopath.

[1846] Right.

[1847] Or they just want to, you know, they want to get attention.

[1848] Who knows why.

[1849] You know, Tim Pool has that problem.

[1850] He's been swatted.

[1851] Like, how many times has Tim Pool been swatted?

[1852] Multiple times.

[1853] Like, many times.

[1854] It's a real issue with him.

[1855] It's an issue with live streamers, though, because you get the reaction.

[1856] Right.

[1857] Because if I'm shooting a show and something happens, I'll never.

[1858] put it out. You don't say anything, right? With a live show, because it's, you know, it's just happening in the moment, you get to see them put their hands up and, you know, the whole nonsense. And then they get their little, like, you get mad about it, it stops the whole show, and they know they had that impact. Of course. Yeah. So that's what's bad about, like, that's one thing. The other thing is, like, what is life then? Is life a performance? Are you going to let that stop them, though?

[1859] But are you capable of being so in the moment that you are just yourself, no matter what?

[1860] Even if cameras are on, you would behave and exist the same way you would if there's no cameras on.

[1861] No. I don't think most people would be capable of that.

[1862] I mean, I enjoy keeping my private life private, my public life public.

[1863] I think there's like, I think that's pretty normal.

[1864] And I think things get weird when everything's online, your family's online.

[1865] I've seen people who they put out everything.

[1866] They put out their kids.

[1867] And they do it for the clicks.

[1868] That's what's weird.

[1869] And you're also not asking the kids.

[1870] Your kids are going to get famous when they're babies, and then they don't have any say in it.

[1871] And then as they get older, people know them.

[1872] And then you run into all sorts of security issues because of that, too.

[1873] And it's not wise.

[1874] And there's a lot of people that don't think.

[1875] They just do it.

[1876] And, you know.

[1877] Well, it's also like an opportunity, like kids channels were big on YouTube.

[1878] where they were running these, like, sorry, family channels are what they called them because you'd watch the family together.

[1879] Yeah.

[1880] And then you'd get, like, your kids would like to watch their kids.

[1881] And people grew multi-million dollar brands on the back of that.

[1882] And it's like, by that point, it's too late to stop because you got a mortgage, you know, you're depending on that money coming in, so you can't stop.

[1883] Your kid better get on a kid.

[1884] I just want to know, like, do you tell your kid, like, get on your mark?

[1885] Like, hey, can you react to that again?

[1886] Can you help me with this thumbnail?

[1887] Like, that's crazy.

[1888] And then if the kid becomes famous when they're young, they're in so much trouble. There's very few people that ever survive being famous when they're young. Very few. They all come out fucked up. It's not a normal way to develop. Fame is a drug that you have to develop a tolerance for, and if you don't develop that tolerance, you actually develop with that drug. Like, instead of experiencing adversity, instead of developing your personality, you know, like, to realize, like, what is wrong with the way I communicate?

[1889] Why do people get mad at me?

[1890] Why are people, why do people like me?

[1891] Like, you sort it out as a human.

[1892] It's how you interact with the world.

[1893] It's why kids, you know, pick on each other and they're mean to each other.

[1894] And they're figuring out how to communicate and be social.

[1895] If you're five fucking years old and you're already famous, you're in deep shit.

[1896] And they're all in deep shit.

[1897] I've met quite a few of them now.

[1898] I've interviewed quite a few of them on this podcast.

[1899] I've met quite a few of them in real life.

[1900] and they're all fucked.

[1901] Everyone who becomes famous when they're a child is fucked.

[1902] I don't know.

[1903] I was going to ask, like, do we know of anybody?

[1904] Just navigating, like, mega fame in general, I don't think I've seen many people do it without kind of getting eaten a little bit.

[1905] Yeah, you get eaten a little bit.

[1906] You need to do something to mitigate that.

[1907] You need to do something real.

[1908] And if you do not do something real, then you're, like, the responses, you get, if that's what you're living for, and if your worth and your value is based on people's attention to you and people's interaction with you, that's not good.

[1909] It's very bad.

[1910] And that's why, I mean, also, like, how many of them are narcissists to begin with and how much of that narcissistic tendency gets fed by being famous?

[1911] It's just like I think with my, like, phone, I've sort of given myself some, like, low-grade ADHD.

[1912] I think too much of the, like, attention online makes you into a narcissist, even if you weren't one originally.

[1913] It has the potential to do so.

[1914] If you don't actively mitigate it, one of the strangest things is like when you get hot online, everybody wants to be your friend.

[1915] All of a sudden, these people come out of the woodwork, and all of a sudden everyone wants to be your friend.

[1916] And then when you're not hot again, now it's like you don't exist.

[1917] And that's a bad way to experience life, that your whole identity and your whole friendship base and everything's wrapped up with how you're doing online. And, like, I know, for me at least, I try to just segment my life to where the online thing is online and all my real friends are just in my city, just, like, kind of regular people who have different jobs.

[1918] I think it's kind of important to detach yourself so that when things aren't going well, it's fine.

[1919] When things are going well, it's fine. There's, like, a stabilizing something. I feel like you're describing Hollywood. You know, you're describing the problems with Hollywood. In Hollywood, when you make it, like, if you're in a movie and you're doing well, everybody loves you. Oh, Coffeezilla, come on through the red carpet. Let's go. Coffeezilla's hot now. We want to put him in this movie, and we want to put him on this show. We want to do this. And then when you're not, no one wants to talk to you. Doesn't that break you psychologically, though? Of course. That's why they're all crazy. I mean, in Hollywood, it's even worse, right?

[1920] Because you don't get to choose your own destiny.

[1921] Like, you've developed your own show and you've created your own thing.

[1922] You haven't been chosen.

[1923] In Hollywood, the problem is you're being chosen for everything.

[1924] So you're being cast in these things.

[1925] So you have to deal with people that approve you or pick you.

[1926] So you're formulating your personality based on whatever the zeitgeist is, whatever the ideology of most of the producers are.

[1927] Like if all of Hollywood was right wing, right?

[1928] If all the producers and all the executives and all the studios were all very conservative and right wing, all actors would be conservative.

[1929] They would all be pro-life.

[1930] They would all be First Amendment, Second Amendment happy.

[1931] They would all carry guns.

[1932] It would be 100% compliance, the same way it is with the left wing.

[1933] They're not necessarily people that think that way.

[1934] They think that way because that is the way to fit in and be successful.

[1935] So you take people that already have this exorbitant need for attention, and then you bring them into an environment where they have to be chosen.

[1936] So you have to figure out what gets me chosen.

[1937] So you form your ideas and opinions based on what's going to be the most successful.

[1938] It's a mating strategy.

[1939] It's weird because the fact you need to be chosen sort of makes you play the same game that you don't like, which is.

[1940] Because you have to go to the power brokers and you have to suck up to them, the same way people suck up to you when you're successful.

[1941] You have to go suck up to the successful people.

[1942] Yeah.

[1943] And now you're playing the same game where you're going to the people who are decision makers and you're trying to woo them and pretend you're their friend.

[1944] That's why when those people do make it and they do get pushed through that red carpet, come on through Tom Cruise.

[1945] They're all fucking crazy.

[1946] And a lot of them treat other people like shit because they want to let you know that they're a part of.

[1947] the chosen class. So that's, like, this thing about certain celebrities being assholes to regular people. Like, why do they treat people like that? Well, the same reason why royalty does it. You know, like, when you see the queen, you're supposed to bow. Like, this is how it goes down. That's why they became the queen in the first place. That's why they became a star in the first place. Because they want to be that person that just gets fucking exorbitant amounts of love and attention. And it's very unhealthy. And it's good, I think, that it's now becoming possible that you can be, like, a Mr. Beast or something and not be in Hollywood. He's, like, in North Carolina or whatever, and he can just do his own thing. He can start his own, and he's as big of a brand as anybody. Yep. And it's like, it just doesn't matter. He's doing his thing. He doesn't have to, like, kind of play the same games. I think that is, like, sometimes I think changes in technology are, like, neutral. It's, like, kind of like you win some, you lose some.

[1948] I think that is a distinct change for the better that we've kind of decentralized Hollywood a little bit.

[1949] It's like you can just start your own show.

[1950] You're not, we talked about being subject to the gatekeepers, but even subject to that like kind of mentality of like everything's about success and fame.

[1951] Yeah.

[1952] That's the currency of Hollywood.

[1953] But it's also the motivation.

[1954] Like what is the motivation to do it in the first place?

[1955] A lot of the people that are in Hollywood, their motivation is purely for attention.

[1956] Their motivation is purely to become successful and famous, whereas his motivation seems to be to have fun and to do things with the money that are actually altruistic and good and beneficial and, you know, charitable. He's a really good guy. Like, that's one of the appeals. And also, there's no one, like, filtering him. That's who he is. That's that guy. He's very smart and very ambitious, but he's also not really money hungry, and he dumps most of the money back into the production of the show. He's legit. I mean, you know, there are a lot of people you meet behind the scenes, and they're, like, they're different. You know, it's like, it's not the same guy you meet, and that's always a huge letdown. I've had so many examples of that. But he was one of the first, like, not first, there are a lot of guys, but the biggest stars, I guess, are the ones that are most like, you're like, ah, you're a bit different. But he was, like, when I met him, we talked a bit, and it's just like, dude, this guy's legit. Like, yeah, he's the real deal. Yeah, that is who you get, the guy that you see when he's doing those videos with his friends, joking around and making them do stunts and pranks and all the different little games that he comes up with where people can win money. That's really who he is. YouTube's lucky, because it could be anybody. Like, they don't select who is, like, on top, and they're fortunate, because it could just be, like, some super narcissistic, like, monster. I don't know if it would work, because it's, so you're saying it's selected for? Like, yeah, because a super narcissistic monster, I don't think, would create something that's relatable. That's a good point. Yeah. How could they get something?

[1957] You had, you had, there was this huge YouTube channel.

[1958] I think they still might be the biggest, like T series or something.

[1959] There's some random corporation.

[1960] I think there are ways to growth hack it maybe, but, but you're right.

[1961] You wouldn't create such a brand.

[1962] Right.

[1963] You couldn't fake it forever.

[1964] You couldn't fake authenticity.

[1965] I don't think you can.

[1966] I think after a while it gets exposed and people realize you're full of shit.

[1967] Yeah, the longer you talk, especially. If you're in little sound bites, you can pull it off for a little bit.

[1968] But the longer you talk, the more it gets shown.

[1969] Unless you just have amazing stamina for bullshit.

[1970] Have you ever talked to somebody like that?

[1971] I'm sure, right?

[1972] Like, statistically, there has to be someone who came on the show and you're like, oh, my gosh.

[1973] Like after like an hour?

[1974] That they're full of shit?

[1975] Yeah.

[1976] Oh, yeah.

[1977] Okay.

[1978] Oh, yeah.

[1979] For sure.

[1980] There's quite a few people that I talk to that are full of shit.

[1981] And it's unfortunate.

[1982] Like sometimes people like, you know, I've had people come on where I don't realize until like an hour and a half, two hours in.

[1983] And then I started asking certain questions, and then you realize, like, there's something fucking funny about your answers here.

[1984] Like, this is not.

[1985] And then we'll research them after the show.

[1986] Like, oh, good Lord.

[1987] You know.

[1988] It's an issue.

[1989] You know, it's like, and there's also people that just, you know, whatever their motivations are, they're not good.

[1990] You know, like, what is their motivation to do a show in the first place?

[1991] Like, is their motivation just to try to make the most amount of money?

[1992] or are they trying to do a good show?

[1993] Like, if you're trying to do a good show and you keep working at it, it'll get better.

[1994] Go watch my early shows.

[1995] They fucking suck.

[1996] You know, like, you get better if you're actually just trying to do it and get better at it.

[1997] But if your motivation is just to make money, like, somewhere along the line usually you slide off.

[1998] I think most people who, like, want to make money, just go into finance.

[1999] Yeah, but they also want attention.

[2000] Oh, right.

[2001] They want money and they want attention.

[2002] And then once you've gotten the attention, that's the thing about.

[2003] fame, right?

[2004] Like, if you go to a store and there's a security guard at that store, you don't think, look at this poor fuck, he's a security guard at a store.

[2005] You just think he's a guy.

[2006] Like, hey, man, what's up?

[2007] How you doing?

[2008] You don't treat him badly.

[2009] But if you go there and it's Will Smith, Will Smith has lost all his money.

[2010] Now he's a security guard at the store.

[2011] Look at this fucking loser.

[2012] Let's go visit Will Smith.

[2013] Ah!

[2014] And you laugh at him.

[2015] How much you make here, Will?

[2016] What are you going to slap me to kick me out?

[2017] Yeah.

[2018] Yeah, yeah.

[2019] You would be free.

[2020] You would be free.

[2021] to do that.

[2022] And that was the case with Gary Coleman.

[2023] You remember Gary Coleman?

[2024] Oh, the...

[2025] Yeah, the tiny guy.

[2026] He used to have that show, Diff'rent Strokes.

[2027] And he was famous on television, but then he, I don't know what happened, lost all his money, and he was a security guard at a studio.

[2028] And they hired him to be the guy that, like, when people drive through, they meet him, and then people realized he was there.

[2029] And this was, like, before social media.

[2030] So this was, like, early on.

[2031] And it was a real problem, because people would go there just to mock him and make fun of him.

[2032] Because someone who used to be famous and now is not is a loser.

[2033] But someone who's just never been famous is just a person.

[2034] It's very interesting.

[2035] But it's so weird, because everyone who achieves any level of notoriety knows how temporary it is.

[2036] More than even the audience, they know that there's a shelf life on everything.

[2037] Very few people make it an entire career.

[2038] Like, I'm always thinking, like, it's going to be over.

[2039] like, you know, next month, right? Because that is just the nature of it. Especially online fame is even more fleeting than the old days of movie stars. It's, like, it's even less so. I don't understand why that is. It feels like, I don't, I don't really buy into that. I mean, I think it's just, people are people, and you have your moment in, like, the spotlight for one reason or another. It's usually not about who you are. It's just, you're saying something of the time that resonates, and things don't resonate forever.

[2040] Right.

[2041] But think of your perspective and where you're coming from.

[2042] You're a 28-year-old guy who is doing really well right now.

[2043] So you are in the spotlight.

[2044] And you haven't had a lot of time outside of it.

[2045] I mean, how old were you when you started your show?

[2046] I think I got actual attention maybe 26, 25, 26?

[2047] Yeah.

[2048] So for the last three years.

[2049] So you didn't go through like this long, terrible period of fucking hating life.

[2050] Darkness and depressed.

[2051] Yeah.

[2052] So a lot of people fucking hate life, and they look at someone who is a movie star or a television star like Gary Coleman, and they go, wow, how the fuck? That guy, how's he doing it? He's got a fucking Ferrari, he's got this, he's got that. And then when they don't have it, ha ha, you're less than one of them. You're less than a normal person, because you're a person that used to be free of it. You're a person that used to be. We love a story of some movie star that spent all their money and now they're broke and crazy. I remember, who was the woman, was it Margot Kidder? Is that her name? The woman from Superman. There was a woman who played Lois Lane in the early Supermans with Christopher Reeve, and she went crazy. And, like, she lost all of her teeth, and someone found her in the bushes somewhere. Like, it was real sad. Like, real mental illness problems. And I remember there was this deep fascination with this person who was a movie star at one point in time and then it had completely fallen apart. Like, what was the story with her? Do you remember the story with her, Jamie? You nailed as much as I remembered. Yeah, yeah. Something happened. She had some sort of a mental health breakdown, and I'm sure some of that had to do with fame and society and acting and just the world that they live in of the movie star, and then also the women's world of a movie star, which is a fucking much more brutal world. Because, you know, we were talking a little bit about Hurley.

[2053] She's the rare, the rare that stays hot.

[2054] She's hot and she's like fucking 80 years old.

[2055] She had an accident, I guess, that left her paralyzed.

[2056] Oh my gosh.

[2057] She lost some money and had some issues with.

[2058] Oh, boy.

[2059] This, I will say, like, one of the challenges is, you don't know how long you're going to stay relevant, and then if you don't make the money then, now what do you do?

[2060] I guess is the point.

[2061] And so if you haven't set yourself up, then, I guess, and a lot of these people, they think it's going to last forever because their agents tell them it's going to last forever.

[2062] So they just spend all their money.

[2063] And then especially if they don't pick up any, you know, skills, one of the things like with actors is all you learn is acting.

[2064] I mean, one of the interesting things now, which is kind of fascinating about like modern, like, you know, people who grew up on TikTok and like the YouTube era is you kind of have to learn like.

[2065] marketing, you have to learn video editing, you have to learn, so you can pick up skills to where you're never going to be completely, you know. I know a lot of YouTubers who now work for other YouTubers, because they, like, they stopped being relevant, but they're like, I understand content, I understand how this stuff works. They're not just a face. They're not just, like, a pretty face. They have actual tangible skills beyond that. That is an issue, though. I mean, I can understand why that's a problem. I think here's a big issue with someone like yourself: what if YouTube goes away?

[2066] That's the real issue.

[2067] Sure.

[2068] If you're relegated to one platform and that platform, what if the platform decides for whatever strange reason?

[2069] Like, what if they get pressure from someone who you've outed?

[2070] Right.

[2071] And they come up with some bogus reason to strike your account and delete your account.

[2072] That's a real issue.

[2073] Yeah.

[2074] Like, if you're beholden to one company, that can be a real problem.

[2075] It is a huge problem.

[2076] Right now, the number one video sharing site in the world is basically YouTube, and that's essentially it.

[2077] I mean, there's not really anyone else.

[2078] There's like alternatives like Rumble, but, you know, like that's kind of, I don't know why.

[2079] It's perceived as kind of like a right wing thing.

[2080] So it's like.

[2081] Sort of, but then, you know, Russell Brand's on it and, you know, Glenn Greenwald's on it.

[2082] Let's say political commentary thing.

[2083] I mean, I don't know if like mainstream, like just like random creators are doing really well.

[2084] I don't know.

maybe it's there, maybe it's not. I think, I think you're pointing out, though, a very good point, which is, like, as much as we talk about the decentralization of gatekeepers, there is one gatekeeper to rule them all still. For someone like me, yeah, that is YouTube. I mean, I would like to think that, you know, throughout, you learn enough about making stuff, making content, that you could move. I would probably try to transition into some, like, production role. Yeah. Start a production company.

[2086] You'd have to figure out a way to do it somewhere else.

[2087] But also you'd have to figure out a way to bring.

[2088] Like here's the other problem, social media, right?

[2089] Social media is where you use to promote the thing that you're doing on YouTube.

[2090] So what if that goes away?

[2091] What if something, like we have to assume that if Twitter was on the verge of bankruptcy, apparently, when Elon bought it?

[2092] Sure.

[2093] It was fast -tracking to bankruptcy.

[2094] What if someone incompetent bought it?

[2095] ran it into the ground, then it doesn't exist anymore.

[2096] Then all those people that use Twitter to promote their businesses, stand -up comedians that use it to promote their tour dates, like, they're fucked now.

[2097] It's gone.

[2098] Now you don't have that vehicle, and so your ability to access your fans is completely gone.

[2099] Yeah, you don't own any of your data.

[2100] So you don't own any of your subscribers' data.

[2101] You don't own any of that stuff yourself.

[2102] It's a real challenge.

[2103] I mean, one of the things, they also control what you can talk about.

[2104] So when I was doing, you know, my first show, I kind of had this, I had this video where I wanted to explore smoking and, like, vapes through the lens of the FDA and how they regulated vaping, and they sort of went after vaping.

[2105] But, you know, it's a problem, but it also, like, seems like it's a lot healthier than just smoking. Cigarettes are, like, the worst thing in the world for any human to be doing.

[2106] Although, you know, it's very fun.

[2107] But they're horrible for you.

[2108] And so I did a video about that.

[2109] YouTube like age gated it.

[2110] So now not only no monetization, which that, you know, it's acceptable.

[2111] It's just kind of the cost of being on YouTube.

[2112] You sometimes get demonetized, whatever.

[2113] The reach was killed.

[2114] So now this video, which everyone loved, nobody can watch.

[2115] Or you won't get recommended, like, you know, the recommended feed.

[2116] There's also a problem that now you're in a specific category.

[2117] Like, I don't know how their algorithm works.

[2118] But if you do get flagged for something, you could get put in a problematic category.

[2120] Right.

[2121] You're either shadow banned or less likely to be recommended.

[2122] Right.

[2123] So, um, so I think especially, I think they say their official stance is they do it on a video by video basis.

[2124] I don't actually know.

[2125] I mean, it's kind of hard to figure out, you know, what's true, what's not.

[2126] But, but I will say like, did I ever do a video about that again?

[2127] No. Yeah.

[2128] Yeah.

[2129] Yeah.

[2130] And that's what happens to people.

[2131] That's a big problem.

[2132] That happened during COVID with a lot of people, you know, people wanted to talk about issues like the lab leak hypothesis.

[2133] Usually they're important issues too.

[2134] That's the problem.

[2135] They're controversial, so they are important.

[2136] Right.

[2137] But it's like, you know, I understand YouTube's perspective.

[2138] They have, they have, I don't know how many, maybe they're supporting hundreds of thousands of people's livelihood.

[2139] And they're like, do we want to risk it all so somebody can say some wild stuff?

[2140] Right.

[2141] And then the advertisers just pull out.

[2142] They lose X percentage of the revenue.

[2143] and then whoever that producer is that allowed that channel to exist, now that person gets fired and, you know, their success in this company is based on whether or not the company is bringing in revenue.

[2144] And if you're allowing all these people to say things that are really terrible to the bottom line of whoever is paying money for advertising, that's not good.

[2145] What I've said is, like, I think a lot of these, you know, some of these companies, they achieve near monopoly statuses.

[2146] It's hard to argue that some of these companies aren't close to a monopoly in their specific, like, domain that they're good at.

[2147] Because, you know, if you're going to make a replica of YouTube, you've seen how hard it is with Rumble.

[2148] Because it's not like you're just video sharing.

[2149] It's like you're video sharing.

[2150] Their AI, their Content ID.

[2151] I think they said they spent like $10 million or $100 million to build Content ID.

[2152] So if you want to compete with them, you need to have it.

[2153] at least that, just to build a Content ID system on par, then you got to go host all the video.

[2154] You got to find the AdWords targeting.

[2155] Google is the best ad targeting in the world.

[2156] They're not going to give you access to their system if you're a competitor.

[2157] They're not going to give you the same deal.

[2158] So it's like this challenge of, okay, who can really compete when there's such a high barrier to entry?

[2159] So I'm thinking like, why are these things not considered some sort of public good in that because we accept that it's.

[2160] so hard to compete meaningfully with these things that are so important to our public discourse.

[2161] I understand the whole argument of like free speech is just freedom to speak against the government, not freedom from a corporation.

[2162] But what I'm saying is when all our discourse is online, why are these companies not some form of like almost like a utility company?

[2163] Yes.

[2164] Like, yes, at some level, you don't have the right to monetize, but do you have the right to at least say something?

[2165] Yeah, that's a good point.

[2166] And that was the point about Twitter.

[2167] That was the conversation about Twitter being the town square and that it should be regulated like some sort of a utility.

[2168] And I could see that argument.

[2169] And also, when you think about the concept of free speech and the First Amendment, none of that existed with social media.

[2170] And they never could have imagined it. Try to wrap your head around social media when they're drafting the Constitution with feathers.

[2171] They're literally writing with a fucking quill.

[2172] They had no idea what they were saying.

[2173] So they were just trying to get people to be able to discuss things without being restricted by the government to stifle tyranny.

[2174] Because at the time, the tyranny was government.

[2175] That's the only people who had the kind of power and oversight to where they could literally stop you from saying anything is a government.

[2176] Now it's like, okay, you want to say something.

[2177] The person who's going to stop you from saying it is probably not the government.

[2178] It's probably like some like random tech executive.

[2179] Yeah, random tech executive who has an ideological bias.

[2180] Unelected from a, yeah, exactly.

[2181] It's this strange thing.

[2182] And I think it's actually a very like, it should be a universal issue because I think conservatives all don't want to be censored.

[2183] And that's usually you get censored.

[2184] But left wing people are all about decentralized power.

[2185] I mean, that's like the idea is like democracy, more elected, not just like these unelected people, but get more of a like kind of.

a group say in powerful decisions. Well, then they also should have a problem with the decisions, even though they happen to kind of go a certain way, still being made by unelected people who can just have arbitrary, you know, biases. Like, that's the thing. It's like, one day Twitter's owned by, um, I forgot, who was the leader of health and safety or whatever at Twitter? Um, oh, Vijaya?

[2187] Yeah, yeah, yeah.

[2188] One day it's her, the next day it's like Elon Musk.

[2189] Right.

[2190] And they have different, like, opinions on things.

[2191] Yeah.

[2192] And so do you want to be subject to like both of their whims?

[2193] Or do you want there to be some sort of thing on, you know, I don't know, on the books that we could at least sort of have a public vote on it?

[2194] Well, this is a narrative that's being bantered about now that Twitter's no longer safe from trolls.

[2195] But Twitter was never safe from trolls.

[2196] It's just they used to be just left wing trolls.

[2197] Now you get right-wing trolls, too. It's more of a center. It's not, it's like, the idea that Twitter leans right now, and it's like, no, it doesn't. Like, how many left-wing people that are addicted to Twitter stayed on? Most of them. A few, like, goofy celebrities valiantly declared they're leaving Twitter, and one of them was my friend. I was like, what the fuck are you doing? Like, why are you posting that you're leaving? Like, so goofy. And you don't even know what you're saying.

[2199] You're just saying this because you think this is going to appeal to your base, that you're so noble.

[2200] You're going to leave before the right-wing trolls come back.

[2201] You know, cut the fucking shit.

[2202] And the good thing about people being allowed to speak is that you allow them to put things out there that can be ridiculed by everybody.

[2203] And so if you really oppose these right-wing ideas, let them post them, and then post something that ridicules them, post something that refutes them, post factual information.

[2204] Get engaged, if that's your thing. You really like doing that. I don't like doing that, but if you like doing that, get in there. Get in there and go to work. It sounds like a huge, I mean, I get exhausted just thinking about it. I'm like, who wants to spend their time, like, arguing with somebody? Like, I don't know, I guess it's just not something I care about. So it's like, to me, that doesn't matter. But I guess to some people, just like covering scams is my thing, this is their whole thing. And I guess that's like, it's like video games. It becomes their game.

[2205] Right.

[2206] That's where they get their score.

[2207] They level up.

[2208] Yeah, they level up.

[2209] Get more followers.

[2210] Level up, get more likes.

[2211] You know, people will tell you about their engagement.

[2212] My engagement on Twitter is up.

[2213] How the fuck do you know?

[2214] I don't even know how many followers I have.

[2215] Why are you paying attention?

[2216] Get out of there.

[2217] Go outside.

[2218] Go do something.

[2219] I think it's deeply bad for your health to constantly be given analytics.

[2220] Like, this is the thing on YouTube.

[2221] I was talking to Lex about this, because he was telling me he likes to not look at his numbers. And I was like, man, I love that. I try not to look at my numbers. The thing is, when you go onto your dashboard, they give you every stat you could ever imagine. And I get it, they're trying to educate you on whether a video is doing well or doing bad or whatever. But I think it's kind of good for artists not to have immediate feedback. There's an argument against that, though, and that's Mr. Beast. Well, no, Mr. Beast has figured it out.

[2222] He money-balled it.

[2223] So he money-balled YouTube.

[2224] YouTube before that wasn't like a science.

[2225] It was like an art. It's like, nobody knew what they were doing.

[2226] He comes and he's like, you guys are all idiots.

[2227] Let's turn this into stats and numbers.

[2228] And I love him and I hate him for it.

[2229] Because I got the one perspective.

[2230] It's like, you kind of saw the Mr. Beastification of YouTube.

[2231] Everyone talks the same.

[2232] Everyone has a, hey guys, what's up?

[2233] Today we're doing this.

[2234] And that's like because he kind of showed like, oh, this is a pretty optimal way of doing it.

[2235] So it's good because he gave people like handles on their own success, which is valuable.

[2236] Like, it's cool that you know why a video does well or not.

[2237] There's also something where it kind of kills a little bit of creativity and inspiration, when all of a sudden you know, like, this segment ain't going to do it.

[2238] Right.

[2239] Like, and you, they give you this graph.

[2240] Have you ever seen the retention graph?

[2241] No. Oh, it's hilarious.

[2242] So you start off at 100 % and then you just see as people leave.

[2243] And then it goes to the end of the video and you see how many people left.

[2244] And like at every moment you can tell if someone clicked off at that moment.
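
The retention graph being described can be sketched with a toy calculation: every view has an exit point, and the curve is the fraction of viewers still watching at each second, starting at 100% and decaying. All numbers below are hypothetical, not YouTube's actual analytics:

```python
# Toy version of a retention graph: given the second at which each
# viewer clicked off, compute the share still watching at each second.
# Hypothetical numbers, not YouTube's real analytics pipeline.

def retention_curve(exit_times, video_length):
    """Fraction of viewers still watching at each second 0..video_length."""
    total = len(exit_times)
    return [sum(1 for t in exit_times if t >= s) / total
            for s in range(video_length + 1)]

# Five viewers leave at these seconds of a 10-second video:
curve = retention_curve([2, 5, 5, 9, 10], video_length=10)
print(curve[0], curve[5], curve[10])  # starts at 1.0 (100%), then decays
```

A dip between two seconds is exactly the "people clicked off at that moment" signal the boot camps tell creators to cut.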

[2245] People are going to get so much fucking anxiety from that.

[2246] And what they do now, and this is taught at, you know, YouTube boot camps, is: look at your retention graph, and everything that wasn't good...

[2247] If people clicked off, you got to cut it.

[2248] You got to stop.

[2249] And I think that creates its own like, you know, sickness.

[2250] Yeah.

[2251] YouTube boot camps are hilarious.

[2252] That's so funny that they have YouTube boot camps. But it makes sense. I mean, if you wanted to treat it like a business, like any other business... you know, if you wanted to get involved, if you wanted to open up a small business somewhere, you could treat YouTube like you're opening up a small business. I get it. It's not my thing, though, so I don't get that aspect. I think that would fuck with what I do. I think that would get in the way. I think it would fuck with what you do, too. I think it gets in the way more than it helps. We've had to move away from it for a while. We really emulated some creators, we liked what they did, but eventually what you realize is, I just have a different audience, and people are here for different reasons. And so I can't just rely on a book... not literally a book, but the playbook of what has worked for you. I have to find out, you know, not only what my audience wants, but what do I want. Yes, I think that's the most important thing. We're not just making... well, I'm not making things for other people. I'm making it because I think it's cool, and I think it's valuable just for me to express it. And so I have to find out, why do people watch my show? What do I want for my show? In a way that, even if nobody wants it, I put it in. Like, I have this whole robot bartender thing, the CGI thing, and I do it because I like it. Yeah, it's fun for me. I get a real kick out of that stuff. I'm a nerd when it comes to that CGI tech stuff, and people wouldn't believe how much time I spend on that.

[2253] I spend like half my day.

[2254] I'm just like tweaking this stuff.

[2255] But that resonates with people.

[2256] That's one of the reasons why people like it.

[2257] I think when when you do something that you like, it's very obvious to the people that are paying attention.

[2258] I think that's part of the appeal of a lot of shows.

[2259] You know, I think that that's why it works.

[2260] I mean, I think that's one of the secrets to my success is that I only have on people that I'm actually interested in talking to.

[2261] So I'm engaged. I'm not just bullshitting my way through someone trying to promote some movie. You know, I'm actually engaged. If I have someone on that's promoting a movie, I'm interested in the movie. I want to know what they're doing. If it's a documentary, I want to know, how did you go about doing this? What's the process? I'm actually engaged. When you're faking it and phoning it in, people know it. They feel it. You know, and that's the beauty of your show. I think your show serves multiple purposes, but one of the things is that it clearly appeals to what you're interested in, and you act as a watchdog. Like, I watched the Celsius video that you put out recently. I watched it today, and I was like, this is so valuable. Because you showed those people that did get scammed, the people that get fucked over by this guy who created this thing. And, you know, they have a voice now. And you can also let all these other motherfuckers that are trying to do something like that know that Coffeezilla is out there, and he's going to find you, and he's going to put you on blast, and people are going to know, and it's going to be more difficult for the next person. And again, it's not the wealthy investors that will sue. It's these people that put in $2,000, and it's the only $2,000 they had. That's where it's so valuable. And I know that you feel that way, and it comes through in your video, and I think that's why it's appealing. That's why it's working.

[2262] I really discovered early on that nobody cares about the numbers.

[2263] The numbers are like the headline or whatever, but ultimately you can't make a, like, this stuff doesn't matter until you get people involved.

[2264] Yeah.

[2265] Until you hear the victims talk.

[2266] They're the heartbeat of everything.

[2267] Because until you hear that, like, what's a billion dollars?

[2268] It's impossible to know.

[2269] Yeah.

[2270] And it's not.

[2271] And then, and then you watch the guy and you're like, this, who would fall for this?

[2272] You know, like, it's easy to get cynical if you just see the numbers and the guy who defrauded people.

[2273] Yeah.

[2274] The second you humanize it and you show a person and all of a sudden you see someone with all the same problems and you can just tell.

[2275] You can see it in their eyes, and they're just wrecked by this guy who they believed. They truly believed it.

[2276] It's like the biggest betrayal.

[2277] You trusted somebody with everything and then they stab you in the back.

[2278] Like this Alex Mashinsky, CEO of Celsius, his whole thing was banks are evil, which is not.

[2279] Not crazy.

[2280] I mean, it's like, you know, you can understand why a lot of people resonated with that.

[2281] They're like, and it wasn't even they're evil.

[2282] They're heartless.

[2283] Yeah, yeah, yeah.

[2284] Sorry, I was going to correct that.

[2285] He said like they're greedy.

[2286] Yeah.

[2287] And that's true.

[2288] Like it's like, and he's not.

[2289] Yeah, yeah.

[2290] He goes like, banks are not your friends.

[2291] True statement, Alex.

[2292] And then this interviewer's like, but Alex is your friend?

[2293] And he's like, yeah.

[2294] He's like, you, basically, you can take the same ride as me, 8%.

[2295] 8% a year.

[2296] I'll just give it to you.

[2297] You know, we're doing the same thing as the banks, we're loaning out your money, but we're going to pass on 80% of the revenue back to you, instead of the banks, which take all your money, right?

[2298] So people bought into that.

[2299] They said, that sounds great.

[2300] Like, hey, the internet changed everything.

[2301] You know, we think crypto's going to change everything.

[2302] Why not have a bank that instead of serving its shareholders, it serves its customers?

[2303] It kind of, like, there's something that makes sense there.

[2304] It's really compelling.

[2305] And then come to find out, Celsius was never making money.

[2306] They said they were paying you out

[2307] with their profits, but really they were paying you out with new deposits. Like, new people were coming in, and they were paying you out.

[2308] And so it was this giant Ponzi scheme where they set the rewards because they knew if it's high enough, people are just going to flock to them.
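
The mechanism being described, promised yield with no underlying profit, can be sketched with a toy model: depositor balances compound at the advertised rate, but the only money that actually exists is the deposits themselves, so the hole only grows. The 8% figure comes from the conversation; the deposit amounts below are hypothetical:

```python
# Toy model of the payout scheme described above: "yield" accrues on
# depositor balances, but no profits exist, so the gap between what's
# owed and what's on hand compounds. 8% is the rate mentioned in the
# conversation; all deposit amounts are hypothetical.

def shortfall_over_time(rate, yearly_deposits):
    """Return the hole (owed minus actually held) after each year."""
    balances = 0.0  # what depositors believe they have, compounding
    cash = 0.0      # what actually exists: just the deposits themselves
    gaps = []
    for d in yearly_deposits:
        balances = balances * (1 + rate) + d  # promised "yield" accrues
        cash += d                             # but only new money comes in
        gaps.append(round(balances - cash, 2))
    return gaps

# $100 deposited each year at a promised 8%: the shortfall compounds,
# and any rush of withdrawals exposes it.
print(shortfall_over_time(0.08, [100.0, 100.0, 100.0, 100.0]))
# → [0.0, 8.0, 24.64, 50.61]
```

As long as inflows keep growing, withdrawals can be covered and the books look fine; the moment new deposits slow down, the accumulated gap is unpayable, which is the collapse pattern described here.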

[2309] And so, but they had this compelling explanation for why, like, it kind of made a little bit of sense.

[2310] And then, when it finally goes wrong, he just walks away.

[2311] I mean, yeah, he's getting sued civilly, but where's the criminal action?

[2312] He's going to go to jail?

[2313] Probably not.

[2315] And it's like, that is so messed up.

[2316] That is such a, that itself is a crime.

[2317] I think it's so sick that we allow, we throw the book at people who will rob a store with a gun, right?

[2318] Steal 10,000 bucks.

[2319] People who steal millions, billions of dollars often get away with it because it's just done a little differently.

[2320] There's not the drama of the gun and somebody's... Even if no one gets shot, it's just, hey, he pushed a few pencils around, he got you to sign a few shady... but that's just as sick and twisted. It's just done in a way that socially is slightly more acceptable, and they get away with it way more often. But I would contend that these people are literally financially murdering people. After Celsius, people committed suicide because of it. I mean, literally, it's a fact: people committed suicide because they lost everything. I'm sure FTX as well. Of course.

[2321] You know, the bigger the scam, statistically it almost becomes impossible that you don't at least financially, sort of metaphorically, murder a family, if not literally kill somebody.

[2322] And people walk away with it. Either only the guy at the top goes down, or nobody goes down.

[2323] And that is crazy to me. It's like, what message are we sending via our regulators?

[2324] Basically, it's like, hey, you're going to get a slap on the wrist.

[2325] If you're caught.

[2326] And this, what you just did, is why you're so successful.

[2327] That's real.

[2328] This is how you really feel.

[2329] This is your, and this is why your show works.

[2330] This is, this is it right there.

[2331] Like, what you just did is why I'm interested in your show.

[2332] Because this is your real thoughts and opinions.

[2333] Something has to change.

[2334] I mean, something has to change. You can't just go on like this, if we're really going to, you know, take our financial future into our own hands.

[2335] We're going to allow these influencers to talk about finance.

[2336] Somebody has to be there when things go wrong.

[2337] Yes.

[2338] And there has to be consequences.

[2339] If you lie and if you cheat and you steal, there has to be a guy at the end of the day who's going to put you in trouble.

[2340] And I think a YouTube video is not nearly enough.

[2341] It's why I'm constantly saying like, hey, can someone from the government get involved?

[2342] Like, go lock this guy up.

[2343] Go lock somebody.

[2344] You know, I know it's like a lot of this is new.

[2345] like the crypto stuff is new, but they're doing old crimes in a new way.

[2346] It's always been illegal to steal people's money.

[2347] And that is what's happening.

[2348] And that's why I put these people on my show.

[2349] So you don't think it's some, like, rug pull where it's all fake money?

[2350] No, there was real money in these companies.

[2351] And they just stole it a new way.

[2352] But they're still stealing money.

[2353] And the fact that we haven't found a way to put some of these people in jail is mind blowing to me. And we're sending a bad message that, hey, just keep doing it.

[2354] Just go start a new one.

[2355] There are people now, they were trying to start GTX after FTX.

[2356] Some new guys were trying to start GT, like the new thing.

[2357] And then HTX is next.

[2358] Jesus Christ.

[2359] It's just like, you know, that's half the purpose of the law. Partly it's, you know, you did something wrong, you get punished.

[2360] But also part of it is you do something wrong.

[2361] You send a message socially. You socially signal that we do not tolerate this.

[2362] And right now the social signal we're sending and accepting is if you scam, there's a very high likelihood you will get away with it.

[2363] And if you don't get away with it, you'll get a little slap on the wrist.

[2364] You'll get a little fine.

[2365] And that's not working.

[2366] No, it's not.

[2367] Hey, man, thanks for being here.

[2368] This is a lot of fun.

[2369] It's a lot of fun, Joe.

[2370] I appreciate your show.

[2371] I appreciate you.

[2372] I appreciate what you're doing.

[2373] And I really enjoyed this.

[2374] Thank you so much.

[2375] Thank you.

[2376] Tell everybody how to get your show, what your

[2377] social media is.

[2378] Coffee Zilla.

[2379] That's it.

[2380] That's the place to find.

[2381] Best place to find me. I appreciate you guys having me on.

[2382] This is surreal.

[2383] Been a big fan of the show.

[2384] Thank you.

[2385] Appreciate you.

[2386] All right.

[2387] Bye, everybody.