Conan O’Brien Needs A Friend XX
[0] Hey, everybody.
[1] Conan O'Brien here.
[2] Just letting you know, we'll have a brand new episode of Conan O'Brien Needs a Friend on Monday.
[3] So stay tuned for that.
[4] But in the meantime, I wanted to share a Team Coco original scripted podcast called Smarter with you.
[5] Smarter follows an egomaniacal billionaire named Noah Lucas as he interviews the next generation of tech CEOs and gives them advice on how to grow without being hindered by the dinosaur concepts of regulation, public opinion, or basic morality.
[7] It's really well done.
[8] The first episode, which will start after you stop hearing my voice, stars Tim Heidecker as the CEO of a pool-sharing app with a massive algorithm problem.
[9] The rest of the series is available on the podcast platform, Luminary.
[10] So be sure to check out Luminary.com for more Smarter, which also stars Kate Berlant, Louie Anderson, Joel Kim Booster, and a lot more.
[12] Again, I think the series is very well done.
[13] You should check it out.
[14] So I hope you do.
[15] Enjoy.
[16] From Team Coco and Luminary Media, this is Smarter.
[17] I'm Noah Lucas.
[18] For two decades, I've led one of the most successful investment firms in Silicon Valley.
[19] Every day, billions of people use products I've helped bring to market.
[20] But I'm also just a regular guy who bathes in a 40-degree slurry, yearns to die in space, and loves burgers.
[21] But something changed recently.
[22] People started asking questions.
[23] Who are these tech moguls who are reshaping our society?
[24] What are they doing with all that money?
[25] And should we kill them?
[26] In other news, a Delaware postal worker was injured by a mail bomb addressed to Silicon Valley billionaire Noah Lucas in an apparent assassination attempt.
[27] I was never in any danger, of course.
[28] My business is registered to a PO box in Delaware to avoid taxes, but it was still a wake-up call.
[29] And when I asked around, I realized I wasn't the only billionaire feeling victimized.
[30] What's happening to billionaires in this country can only be compared to Kristallnacht.
[31] The system is rigged against billionaires.
[32] Why do you think there are no trillionaires?
[33] You can only shelter your home behind so many layers of bulletproof glass.
[34] And then they criticize me because my cleaning staff suffocated while I was at a Warriors game.
[35] I mean, please.
[36] Tech needs an advocate, someone to remind people of everything we've done for them, and to get them excited about what we're going to do to them next.
[37] That's why I started this podcast.
[38] To show you how tech and venture capital are working together to build your future.
[39] Every week, I'll profile one of the hottest new tech firms.
[40] Paying for sex is illegal.
[41] Paying for consent is innovation.
[42] The marketplace decides what news is true?
[43] Imagine a world in which extreme weather could simply be deleted.
[44] Some of our drivers killed their mother hundreds of times.
[45] It just is a revolution.
[46] Of course people are going to die.
[47] You'll come along with me as I hear their stories and advise them on how to nail down that billion dollar valuation.
[48] Then I'll decide if I want to invest.
[49] You're going to get the complete accelerator experience in podcast form.
[50] TLDR.
[51] I'm going to make you smarter.
[52] The question is, why can I pick any pool I want to and go swim in it?
[53] That is what a billion-dollar idea sounds like.
[54] And the man who's saying it is Alex Patterson, creator and CEO of Puel.
[55] It's the app that connects swimmers with nearby pools and allows pool owners to make extra money renting out their pools when not currently in use.
[56] Noah Lucas is here for Alex Patterson.
[57] Yeah, welcome.
[58] Let me see if Alex is ready.
[59] Okay.
[60] Thank you.
[61] Wow.
[62] I love the stuff for you.
[63] So many pools.
[64] That's Elena Lynn.
[65] Technically, she's the CFO at my investment firm Lucas Cap, but she's such a rock star that she also makes my schedule, gives me legal advice, produces this podcast, feeds my two-toed sloth, and manages my family and everything else I'm too busy for.
[66] Noah, you're standing in water.
[67] So I am.
[68] I rely on her perceptiveness, and she's right on the money here.
[69] I've accidentally waded into one of Puel HQ's many pools.
[70] There he is.
[71] No-wow!
[72] What's going on, man?
[73] Come on in.
[74] Welcome to Casa Puel.
[75] Got to get you guys a shot.
[76] Alex greeted me shirtless with a slight sunburn and a towel slung around his neck.
[77] He looks every bit the Florida State frat boy he was when he started his first venture, an app that connected underage teens with adults who would buy beer or rent cars for them, called Wrecked.
[78] But he's swimming with the big dogs now.
[79] Let me just make that clear.
[80] It's not P-O-O-L.
[81] No, no, no. It's P-U-E-L. You say Pool, I say Puel.
[83] The business model he dreamed up was simple.
[84] Alex envisioned a state-of-the-art sorting algorithm to match users with pools tailored perfectly to their swimming preferences.
[85] I wanted a system that would find you the best pool possible based on the data we have on you.
[86] So, for example, Puel knows my socioeconomic background, so it knows, you know, I don't want to swim in a pool that's, like, sketchy or, you know what I mean?
[87] So Alex had this great idea.
[88] But the next step was harder.
[89] He had to make it real.
[90] Alex told me the story in one of his several conference pools, all named after different celebrities who own pools.
[91] Let's walk into Blake Shelton.
[92] It's a lot nicer than the garage where he started this business, but he hasn't lost his rebellious spirit.
[93] Typing with a daiquiri in his hand, he's more Jimmy Buffett than Warren Buffett.
[94] So once I left Florida State, I was getting all this pressure to get a job, but I was sort of like, what's the point, right?
[95] Why?
[96] So, you know, I went to Vietnam.
[97] And I was living there, taking a couple of vacation years. And I don't know if you've been there. Vietnam rules. It's a non-stop party. It's super cheap to hire servants, and you know you need them, because you're partying a lot and shit just gets broken. You don't have time to clean it up, right? The food is incredible if you like spice. They've got hot sauce, you know. Colonization by the French did so much for their cuisine. And they're quite forgiving of the Vietnam War. It actually has been barely a factor in their history. No, they're almost grateful for it, because it created a lot of... a good story to tell.
[98] You know, I ended up sort of pinching some code that I'd pirated from a company, a Vietnamese company called Mop, that tracks local cleaning ladies who own mopeds.
[99] Okay, and so I'm looking at the earliest version of Puel right now.
[100] It looks like the city of Hanoi with a bunch of pools moving around on it.
[101] Are those actual pools?
[102] No, well, those are cleaning ladies, but you can see where I'm headed there.
[103] Right, exactly.
[104] Okay.
[105] Then he showed me what the app looks like today.
[106] This is very intuitive.
[107] Do I want to find a pool with swimmers like me?
[108] Yes, of course.
[109] Wow, 208 pools in my area.
[110] Can that be right?
[111] Yeah, it's great.
[112] You can rank them by distance or star rating or pool shape, pool volume.
[113] We do pH.
[114] You've got dog-friendly, dog water slide.
[115] Dogs-only.
[116] Dogs-only.
[117] Concrete scratchiness, depending on what you like.
[118] Availability of grapes.
[119] But the default order is based on our proprietary pool -match algorithm.
[120] Patterson and I headed to a nearby pool that was listed as trending on the app, where he showed off his favorite community engagement strategy, rolling out a case of free rum, seemingly out of nowhere.
[121] Hey, who wants to see me drink a whole bottle?
[122] There we go, down the highway.
[123] The app was growing fast, but growth brought challenges of its own.
[124] People at some point were listing pools on the platform that were really just holes they'd dug on public property and filled with water.
[125] And the algorithm would recognize those holes as pools?
[126] Yeah, so we updated our community guidelines, which now enforce a strict pool definition, but users can also search for sun-warmed mud holes.
[127] So everyone's happy.
[128] There you go.
[129] But users found other ways to put themselves and Puel at risk.
[130] Oh, well, there was underage drinking and sex was happening at the pools.
[131] Right now, if that was me, I would immediately be seeking ways to plausibly deny knowledge of that.
[132] Well, in our terms of service, the swimmer agrees not to break the law, and the pool owner agrees to ensure compliance.
[133] And then, of course, they have to rate each other, which incentivizes lying about how great everything is.
[134] Right.
[135] It's L-to-L. L2L, or liar-to-liar, business models are well known in the Valley.
[136] If you've ridden a Bird scooter without a helmet, that's L2L.
[137] Puel was showing exponential month-over-month growth right from the jump.
[138] It's been an incredible opportunity for local businesses.
[139] In fact, we've just started working with a company where you can hire a clown to come for your kid's birthday party.
[140] And it's been huge.
[141] It's been unbelievable for the clown industry.
[142] Wow.
[143] We have, I think I just saw this the other day.
[144] We have in some areas 100% clown penetration.
[145] That's unprecedented.
[146] It's been tremendous.
[147] But what's happening is parents have figured out how to hack the app.
[148] So what they're doing is they're having these after-school pool parties from, you know, 3 o'clock to 6 o'clock on the weekdays.
[149] And they're hiring the clowns to come.
[150] And basically the clowns are now babysitting the kids.
[151] That's amazing.
[152] The clowns must love that.
[153] Oh, yeah.
[154] The clowns are over the moon about that.
[155] In fact, people are saying we're potentially creating a clown bubble, which we don't want to see.
[156] Here come the haters.
[157] Send in the clowns, and here come the haters.
[158] Elena, do your friends use this?
[159] Are people younger than you using this?
[160] I mean, it sounds cool, but I don't really have time to be swimming at strangers' pools.
[161] Right, yeah, right.
[162] Elena, she just means she doesn't swim.
[163] She caught a staph infection at a water park when she was 14.
[164] Why is that the only fact you remember about me after five years?
[165] I remember a lot more facts about you.
[166] I just like staph infections.
[167] But I could tell you this, within six months of our launch, we controlled more pools than anyone in human history.
[168] Even Napoleon.
[169] That's an accomplishment.
[170] But underneath all that celebration, a crisis was brewing.
[171] Some people began to suspect Puel's algorithm was sorting swimmers based on another factor: the color of their skin.
[172] So all of a sudden, people are calling me a segregationist, like I'm Jim Crow in a bathing suit.
[173] It's like, dude, I run a pool app, okay?
[174] I don't even know who Jim Crow is.
[175] Well, the Jim Crow laws were state and local laws that enforced racial segregation in the South.
[176] I know.
[177] As we'll find out in a second, it wasn't all smooth swimming ahead for Puel.
[178] Would they be able to recover?
[179] And would they win my investment?
[180] Hear that change in the music?
[181] On most podcasts, that indicates an ad, but not on Smarter.
[182] You see, as an innovator, I'm on the clock 24 hours a day, so I'll also be giving you a glimpse of my life at home, where there are many crucial business lessons to be learned.
[183] Like knowing how to capitalize.
[184] It's what sets apart a successful investor from a run-of-the-mill mail bomb victim.
[185] Today I'll be capitalizing on my victimhood in front of TV cameras with my beautiful wife, Bronte.
[186] You booked us on the Today Show?
[187] Are we trying to hide from the bomber in 1991?
[188] That's Bronte there.
[189] She's a well-regarded artist, known for her immersive hologram installations, and she's already famous, so she won't mind that I've bugged our entire home and started recording all of our interactions.
[190] No, you can't do tomorrow anyway.
[191] The FBI guy is coming back here at 10 a.m. You talk to the FBI?
[192] Mm-hmm.
[193] We can't be begging for help from the government, maybe.
[194] That makes us look afraid.
[195] We'll lose investor confidence.
[196] Just talk to him.
[197] I'm not going to get blown up by a bomb, unless it's on my own terms, and that's very far from the focus of my work right now.
[198] Listen, I'll cooperate if you come on the Today Show with me. It'll be fun.
[199] You could introduce yourself to Middle America.
[200] My whole brand hinges on me not appealing to Middle America.
[201] There's a cool morning show on Twitch we could do instead.
[202] It uses the Turok: Dinosaur Hunter graphics engine.
[203] Bronte.
[204] We've got 10 times the viewers of any morning network show.
[205] We have 24 hours to capitalize on this.
[206] And then it's on to the next new bag of celery or whatever they promote on that stupid show.
[207] Can't you just do it alone?
[208] Tell them I'm in the hospital, replacing my lungs with bellows.
[209] I couldn't convince Bronte to play along, so I swallowed hard and burned my bridge with Savannah Guthrie.
[210] Marriage is all about compromise.
[211] Alex Patterson's company, Puel, was growing astronomically and expanding faster than any of his models predicted.
[212] Oh, it was a blast.
[213] I mean, at a certain point, we stopped trying to predict things, or really do any work at all because the algorithm was just taking care of everything.
[214] We started having more parties, we were drinking at work.
[215] The office kind of became our little Vietnam, not in the sense of the war.
[216] In the good way, the modern Vietnam.
[217] But his hands-off approach allowed larger social problems to take root in his utopian pool-going app.
[218] Swimmers noticed that the more they used Puel, the more racially homogenous their pools became, like they were only being sent to swim with their own race.
[219] Users began to suspect the app was segregating them.
[220] The app was sorting users based on their incomes, school districts, purchase histories, the music they're listening to, I don't know, the skin creams that they buy.
[221] And we didn't realize how deeply racially encoded all those factors are.
[222] And frankly, you know, that's society's fault.
[223] A lot of cities were de facto segregated already, long before Puel got there.
[224] And then people are rating their pools on how much they like swimming there. So when a white person swims in an all-white pool and then gives it a good rating, that's teaching the algorithm that white people like swimming together. Exactly. We were being victimized by racist inputs being embedded in our platform by users who may not even know that they're racist. It's not our racism. No, it's not your racism. No, it's not coming from us. And it's frustrating, because from where I'm sitting, the technology was performing fine, you know? So your app is enforcing whites-only pools, and to you, that's performing correctly?
[225] Well, okay.
[226] I want to be clear about something: I am not a segregationist or a racist.
[227] I'm thrilled that these issues were settled in the 60s, and I have no interest in seeing them relitigated.
[228] So this firestorm is growing on social media, and your customer support staff wasn't telling you about this?
[229] Well, we don't have a customer support staff, okay?
[230] All those calls, they go to one phone that anyone in the office can answer.
[231] It's really amazing.
[232] You know, you could call that number and I could pick up.
[233] I mean, unfortunately, that one phone got dropped in the toilet during one of our big parties, but I think everyone just assumed that somebody else was going to replace it.
[234] It didn't happen.
[235] Whatever.
[236] But I know now that I should have been more focused on it because suddenly states were banning us for violating the Civil Rights Act of 1964.
[237] Right.
[238] How many states?
[239] The current count is 20, but we're appealing all those.
[240] Okay.
[241] So being banned in 20 states, obviously crimps your growth profile.
[242] How do you respond to that?
[243] Well, we added an "I don't feel comfortable at this pool" button.
[244] The idea is that you would use it if you found yourself in a segregated pool.
[245] Well, I mean, I would certainly feel uncomfortable in a situation like that.
[246] And that was our intention.
[247] But in practice, the feature was mostly abused by white people who got nervous when a person of color would begin swimming nearby.
[248] And you didn't have any way to account for that?
[249] Excuse me. If I could make an app that could end all racism and also scale and be profitable, I would absolutely make it.
[250] It's just a challenge we weren't ready for.
[251] Patterson had given his users every chance to teach the algorithm not to be racist, and they'd let him down.
[252] So once it became clear we couldn't fix the problem with the app, we set up this brilliant system where users would be bused for free to pools in other neighborhoods to achieve racial parity.
[253] Right, and did that work?
[254] I mean, buses really only seem to make matters worse.
[255] You know, I think people just... love to complain.
[257] So you got a lot of people's attention, like Congresswoman Alexandria Ocasio Cortez.
[258] Well, she really came after us, huh?
[259] She whipped up all this outrage when she didn't have access to all the facts.
[260] Now, I have the tweets right here.
[261] I want to get this right.
[262] You commanded her to debate you on the issue of pool sharing.
[263] Well, as I've stated, I regret using the term "I command you," but the invitation still stands, and I think it would be very enlightening for her.
[264] Puel knew they needed help.
[265] So they worked on their outreach by hiring CNN's Van Jones as a diversity consultant.
[267] I figured just hiring him would solve everything, you know, but I'm not really sure what he did for us.
[268] You know, he showed up at a couple of these pools, took our money, swam in the pools, and then we never heard from him again.
[269] Did you take his advice?
[270] Well, the ideas that Van brought to the table just would have been impossible to implement and would not work with the algorithms we have set up.
[271] You know, Van Jones isn't an engineer, and he doesn't understand that the changes he sees us making would be impossible without spending a lot of time and a lot of money.
[272] So the software fix doesn't work.
[273] Van Jones leaves you high and dry.
[274] What's left to try?
[275] We're innovators, right?
[276] We have to win with a good product.
[277] We went back to basics, something we knew was guaranteed to win.
[278] We invented a new type of water.
[279] I see.
[280] It's a proprietary formula, cheaper than H2O, and 50% harder to drown in.
[281] It's amazing.
[282] We offered it first to non-white poolgoers, and then to anyone else who was upset by the segregation crisis, by inviting them to enjoy a free month of swimming.
[283] That also helped us beta test the new pool fill.
[284] Best of all, the new chemicals allow you to discreetly relieve yourself in the pool with no risk of contamination.
[285] Urine, fecal matter, vomit, and wood chips all dissolve immediately.
[286] The only problem?
[287] It can cause certain skin tones to glow in the dark for two to five weeks after swimming.
[288] We should have tested the chemical more.
[289] For me, it was a first-hand lesson in the value of a diverse QA team, and we're currently reformulating with an emphasis on not illuminating the skin of Latinx and African Americans.
[290] Lesson learned.
[291] So it's been really pretty great.
[292] Puel was pivoting their entire company towards making amends.
[293] But with Puel becoming increasingly racially charged, tragedy was inevitable.
[294] So what we know is this.
[295] A police officer responding to a trespassing call confronted a group of teens, and there was some kind of interaction.
[296] He claims he saw a glow-in-the-dark alien.
[297] He feared for his life and shot two unarmed swimmers.
[298] And we take that very seriously.
[299] He also shot a clown.
[300] So no daiquiris that day, I imagine.
[301] Certainly fewer.
[302] But they're all supposed to make a full recovery.
[303] Not the clown.
[304] What do you do when you create a perfect machine, but the humans who use it are flawed?
[305] In 1945, a man named Kalashnikov invented an immensely popular automatic weapon, the AK-47.
[306] Is he responsible if someone uses it to murder?
[307] Of course not.
[308] But sadly, tech firms are segregated into a category of their own and discriminated against for the problems people create using the inherently neutral tools we give them.
[309] As an investor, I'm always looking for the next AK-47.
[310] But if Puel was going to make it, Alex needed to hear some hard truths.
[311] Well, here's how I see it, Alex.
[312] Your users put you in this position, and they need to be held accountable for what they've done to your algorithm.
[313] You have to start suing your users.
[314] That's not what I wanted to hear.
[315] It's the only way forward.
[316] You already have all the data on every pool swimmer and owner that forced segregation into your pools.
[317] Just connect the dots and report them to the Justice Department.
[318] Well, I'm a little worried about how annihilating our user base would play.
[319] Right, but think of it as protecting your platform, right?
[320] Your algorithm is being defiled by your hateful user base.
[321] Right now, it's vulnerable and probably frightened.
[322] Your creation needs you now more than ever.
[323] Would you stand up to the bigots and restore your algorithm's good name?
[324] All right, well, let me have a drink and I'll think on it.
[325] Not able to fully capitalize on the bombing by appearing on the Today Show with Bronte, I set out to do the next best thing and visit the postal worker who was burned by the bomb for a mutually beneficial photo op. Mr. Ralph, I'd like to thank you for your service by presenting you a gift.
[326] This is a working prototype of a mail-delivering robot that my company's been developing to replace postal workers.
[327] Had the robot delivered the mail instead of you, he would have been back on his rounds already rather than laying around in a government subsidized hospital.
[328] Oh, look at that.
[329] He's delivered you the mail.
[330] Folks, he's still a little bit shell-shocked.
[331] He's shell-shocked by the bomb that exploded in his face.
[332] We need to give him a little time.
[333] Thanks, everybody.
[334] Any updates on the bomber?
[335] No questions right now.
[336] This is just for publicity.
[337] I would like to announce my new podcast, coming to Luminary.
[338] The photo op went great, but after the press cleared out, a grave-looking man approached me. Mr. Lucas, you want a photo real quick?
[339] Mr. Lucas, I'm Agent Eddie Clark from the FBI.
[340] I'm investigating the mail bomb you received.
[341] I talked to your wife earlier.
[342] Mr. Lucas, we really need to talk about this.
[343] This investigator had already frightened my wife.
[344] Now he was taking advantage of my publicity stunt to corner me and waste my time.
[345] He was not making a good impression.
[346] I'm hoping that we could talk just about who you think might have sent the bomb.
[347] If you have been getting any threats, anything at all could be helpful.
[349] Conducting an investigation after a bomb scare is just feeding the trolls, okay?
[350] If people find out I had to waste all this time with you, they're just going to keep sending the bombs, all right?
[351] Mr. Lucas, I'm trying to keep your family safe.
[352] Oh, really?
[353] Well, then why aren't you spending all your capital on a plan to send my descendants to Mars in the event a rogue comet comes to destroy the earth?
[354] Oh, that's right, because your motives are all egocentric.
[355] I'm sorry, I have to go.
[356] Mr. Lucas, the FBI tries their best each time.
[357] That's our credo.
[358] I can take care of myself.
[359] I've got plenty of grenades at my skyscraper, but your work is appreciated. You have explosives in your skyscraper? I was just joking. I wish the FBI all the luck in the world in finding my mail bomber, but they never will, so I've stopped thinking about it. When we left off, I had convinced Puel's CEO, Alex Patterson, to sue his users for making his platform racist. A few weeks passed, and it was time to decide if I was going to invest in his company or not. Okay, Elena, I've got a big decision to make. Yep. What do you think of Puel? I've given them the blueprint. The question is, are they following through?
[360] I haven't seen anything in the press about the lawsuits.
[361] Well, you don't launch a PR campaign to notify people you're suing them.
[363] You sue everyone individually to keep them all frightened and confused.
[364] I mean, if you like the app, then maybe we should just try to reproduce some of their IP without them noticing.
[365] Because personally, I wouldn't want to be associated with an app that's racist.
[366] Really?
[367] I mean, I'm just not convinced that people care about this stuff anymore.
[368] I feel like racism is just an online game people play.
[369] Am I wrong?
[370] Yes, I think you're wrong.
[371] All right, let's get Alex on the phone.
[372] But then we had an unusual problem.
[373] We couldn't get a hold of Alex.
[374] Hello.
[375] No, come on.
[376] We're in the back.
[377] Alex, hello?
[378] Just come in the back.
[379] Hello?
[380] Yeah, we're in the back.
[381] We're in the back.
[382] Alex, hello.
[383] That was the last phone call we had.
[384] More weeks went by, no lawsuits filed, no changes to the platform.
[385] Puel continued to take on water.
[386] The alt-right started using Puel as their summer base.
[387] Which attracted followers of Antifa.
[388] No borders, no walls, no swimming pools at all.
[389] This led to multiple incidents of pools exploding into political violence, like the kind caught on this viral video.
[390] This is the police, everybody out of the pool.
[391] Puel was slow to respond.
[392] They didn't address the alt-right at all, instead condemning Antifa's black-bloc swimming techniques, including swimming with masks and opaque goggles on.
[393] From the outside, it seemed like more of a mess than ever.
[394] Honestly, I had given up on Puel.
[395] Then, I got a phone call.
[396] Hey, uh, Noah.
[397] How's it going?
[398] Um, listen, I'm calling everyone in my life to apologize.
[399] What the fuck?
[400] Do you understand how much of my time you've wasted?
[401] You have exactly 30 seconds to explain where the fuck you've been.
[402] Alex was calling from a rehab facility.
[403] Puel's board was growing concerned about the company's PR disasters and was considering a takeover.
[404] So Alex blamed his management struggles on an alcohol problem and entered treatment.
[406] Yeah, I'm in rehab.
[407] Can't fire a guy in rehab, huh?
[408] This place is actually kind of cool.
[409] It's like AA, but also Raya, because, you know, only rich people can go.
[410] Alex explained to me that the stress of being implicated in a massive racial crisis drove him to drink.
[411] Hey, did I drink far too much?
[412] I'm realizing that, yes, I did.
[413] But it was just to avoid a problem I couldn't solve.
[414] The problem of how society tries to sweep segregation under the rug.
[415] And Noah, I've got to be honest with you.
[416] I'm starting to think society itself should enter treatment.
[417] But there was still life in him.
[418] Alex fired his entire office staff for failing to prevent him from descending into alcoholism.
[419] And he was eager to discuss the plans he still had for Puel.
[420] The first 30,000 summonses went out last week.
[421] We'll probably be bogged down in lawsuits for the next four or five years.
[422] But after that, I think we're looking really good.
[423] A lot of users deleted the app, but to that I say good riddance.
[424] You know?
[425] It just proves you're a racist.
[426] We're culling the herd.
[427] We're getting rid of the sort of the nastiness.
[428] I listened and wished him luck.
[429] Now it was my turn to make a tough choice.
[430] Okay, so you just got off the phone with Alex.
[431] Are you going to invest?
[432] Well, Patterson's an idiot, but he's a convenient fall guy for any future problems.
[433] I can tell just by his voice he'll relapse.
[434] Frankly, I'm not really interested in Puel anymore.
[435] Or pools generally.
[436] People can't be trusted in groups.
[437] The future is in personalized, ad-supported, single-occupancy swim pods.
[438] I don't know.
[439] Puel may be a good acquisition target.
[440] Rumor is a major insecticide company might buy it for its proprietary water blend.
[441] Yeah, I think that's the right place for it.
[442] To be honest, I'm most interested in the database they've built of people who may be racist.
[443] That's a trove of information that could be more valuable than gold.
[444] I threw it to Patterson, who was in no condition to drive a hard bargain.
[445] Alex, I'm not going to invest in Puel.
[446] But listen, you're probably incurring a lot of upfront costs on all the paperwork with the courts right now, so I'll give you $3 million, just for your racism data.
[447] $3 million?
[448] Mm-hmm.
[449] The data is worth much more than that, don't you think?
[450] Not to you.
[451] To you, it's toxic.
[452] I mean, sitting there waiting to get breached, which will destroy your company, then you won't be able to monetize any of it.
[453] This is my one and only offer to help you dispose of it.
[454] If I don't sell it to you, will you promise not to hack it?
[455] Absolutely not.
[456] I told Alex to take all the time he needed, but if I didn't hear back from him soon, I'd start bad-mouthing him to other VCs and saying he relapsed.
[457] Less than three minutes later, the phone rang.
[458] Okay.
[459] Okay.
[460] You know what?
[461] I'm in.
[462] That makes me very happy, Alex.
[463] And by the way, the idea you had about me hacking your company, I hadn't even thought of that until you said it, so don't make a stupid mistake like that ever again.
[464] I definitely could have done it easily, especially with all the access you've given me. So basically, I just gave you a huge gift, okay?
[465] Okay.
[466] Shut the fuck up.
[467] Puel had an issue we see often in tech.
[468] It was a great platform with a people problem.
[469] For all the strides we've made in automation, the attention and micro-transactions of people are still necessary to power the digital economy.
[470] But unlike algorithms, people are flawed.
[471] For now at least, the CEO is the only kind of person who can't yet be replaced by a machine, which means I'll have to scale my impact the old -fashioned way, by passing my knowledge to others, making all of you less powerful versions of me. Maybe that's what this podcast is all about.
[472] And when I think of it that way, I'm overwhelmed with emotion at the thought of my own generosity.
[473] Smarter is a production of Team Coco and Luminary Media.
[474] It is created by Sam West, Matt Kleinman, and Chris Sartinsky.