Acquired XX
[0] You like my Bucks T-shirt?
[1] I love your Bucks T-shirt.
[2] I went for the first time, what, two weeks ago when I was down for a meeting at Benchmark, and the nostalgia in there was just unbelievable.
[3] I can't believe you hadn't been before.
[4] I know Jensen is a Denny's guy, but I feel like he would meet us at Bucks if we asked him.
[5] Or at the very least, we should figure out some NVIDIA memorabilia to get on the wall at Bucks.
[6] Totally.
[7] Fit right in.
[8] All right, let's do it.
[9] Let's do it.
[10] Is it you, is it you who got the truth now?
[11] Is it you, is it you?
[12] Sit me down, say it straight.
[13] Another story.
[14] Welcome to Season 13, Episode 3 of Acquired, the podcast about great technology companies and the stories and playbooks behind them.
[15] I'm Ben Gilbert.
[16] I'm David Rosenthal.
[17] And we are your hosts.
[18] Today, we tell a story that we thought we had already finished, Nvidia.
[19] But the last 18 months have been so insane, listeners, that it warranted an entire episode of its own.
[20] So today is a part three for us with NVIDIA, telling the story of the AI revolution, how we got here, and why it's happening now, starting all the way down at the level of atoms and silicon.
[21] So here's something crazy that I did a transcript search on to see if it was true.
[22] In our April 2022 episodes, we never once said the word generative.
[23] That is how fast things have changed.
[24] Unbelievable.
[25] Totally crazy.
[26] And the timing of all of this AI stuff in the world is unbelievably coincidental and very favorable.
[27] So recall back to 18 months ago.
[28] Throughout 2022, we all watched financial markets, from public equities to early-stage startups to real estate, just fall off a cliff due to the rapid rise in interest rates.
[29] The crypto and Web3 bubble burst, banks failed.
[30] It seemed like the whole tech economy, and potentially a lot with it, was heading into a long winter.
[31] Including Nvidia.
[32] Including Nvidia, who had that massive inventory write-off for what they thought was over-ordering.
[33] Yep.
[34] Wow, how things have changed.
[35] Yeah.
[36] But by the fall of 2022, right when everything looked the absolute bleakest, a breakthrough technology finally became useful after years in research labs: large language models, or LLMs, built on the innovative transformer machine learning architecture, burst onto the scene.
[37] First with OpenAI's ChatGPT, which became the fastest app in history to reach 100 million active users, and then quickly followed by Microsoft, Google, and seemingly every other company.
[38] In November of 2022, AI definitely had its Netscape moment, and time will tell, but it may have even been its iPhone moment.
[39] Well, that is definitely what Jensen believes.
[40] Yep.
[41] Well, today we'll explore exactly how this breakthrough came to be, the individuals behind it, and of course, why the entire thing has happened on top of NVIDIA's hardware and software.
[42] If you want to make sure you know every time there's a new episode, go sign up at acquired.fm/email.
[43] You'll also get access to two things that we aren't putting anywhere else.
[44] One, a clue as to what the next episode will be, and two, follow-ups from previous episodes based on things that we learned after release.
[45] You can come talk about this episode with us after listening at acquired.fm/slack.
[47] If you want more of David and me, check out our interview show, ACQ2.
[48] Our next few episodes are about AI, with CEOs leading the way in this world we are talking about today, and a great interview with Doug DeMuro, where we wanted to talk about a lot more than just Porsche with him, but we only had 11 hours or whatever we had in Doug's garage.
[49] So a lot of the car industry chat and learning about Doug and his journey and his business, we saved for ACQ2, so go check it out.
[50] One final announcement: many of you have been wondering, and we've been getting a lot of emails, when will those hats be back in stock?
[51] Well, they're back.
[52] For a limited time, you can get an ACQ embroidered hat at acquired.fm/store.
[53] Go put your order in before they go back into the Disney vault forever.
[54] This is great.
[55] I can finally get Jenny one of her own, so she stops stealing mine.
[56] Yes.
[57] Well, without further ado, this show is not investment advice, David and I may have investments in the companies we discuss, and this show is for informational and entertainment purposes only. David, history and facts.
[58] Oh, man. Well, on the one hand, we only have 18 months to talk about.
[59] Except that I know you're not going to start 18 months ago.
[60] On the other hand, we have decades and decades of foundational research to cover.
[61] So when I was starting my research, I went to the natural first place, which was our old episodes from April 2022, and I was listening to them, and I got to the end of the second one.
[62] And, man, I had forgotten about this.
[63] I think Jensen maybe wishes we all had forgotten about this, in one of NVIDIA's earnings slides in 2021.
[64] They put up their total addressable market, and they said they had a $1 trillion TAM.
[65] And the way that they calculated this was that they were going to serve customers who provided $100 trillion worth of industry, and they were going to capture just 1% of it.
[66] And there was some stuff on the slide that was fairly speculative, you know, like autonomous vehicles and the Omniverse, and I think robotics were a big part of it.
[67] And the argument is basically like, well, cars plus factories, plus all these things added together is $100 trillion.
[68] And they can just take 1% of that because surely their compute will amount to 1% of it, which I'm not arguing is wrong, but it is a very blunt way to analyze that market.
[69] Yeah, it's usually not the right way to think about starting a startup.
[70] You know, oh, if we can just get one percent of this big market, blah, blah, blah.
[71] It's the most top-down way I can think of to size a market.
[72] So you, Ben, rightly called this out at the end of NVIDIA Part 2.
[73] And you're like, you know, I think to justify where NVIDIA is trading at the moment, you kind of actually have to believe that all of this is going to happen, and happen soon: autonomous cars, robotics, everything.
[74] Yeah.
[75] Importantly, I felt like the way for them to become worth what they were worth at that time literally had to be to power all of this hardware in the physical world.
[76] Yep.
[77] I kind of can't believe that I said this because it was unintentional and uninformed, but I was kind of grasping at straws trying to play devil's advocate for you.
[78] And we just spent most of that whole episode talking about how machine learning, powered by Nvidia, ended up having this incredibly valuable use case, which was powering social media feed recommenders, and that Facebook and Google had grown bigger than anyone ever imagined on the internet with those feed recommendations, and Nvidia was powering all of it.
[79] And so I just sort of idly proposed, well, maybe, but what if you don't actually need to believe any of that to still think that Nvidia could be worth a trillion dollars?
[80] What if?
[81] Maybe, just maybe, the internet and software and the digital world are going to keep growing.
[82] And there will be a new foundational layer that Nvidia can power.
[83] Is that possible?
[84] And I think we were both like, yeah, I don't know, let's end the episode.
[85] Yeah, sure.
[86] We shrugged it off and we were like, all right, carve outs.
[87] But the crazy thing is that, of course, at least in this time frame, most things on Jensen's trillion-dollar TAM slide have not come to pass, but that crazy question just might have.
[88] And from NVIDIA's revenue and earnings standpoint, it definitely has.
[89] It's just wild.
[90] All right.
[91] So how did we get here?
[92] Let's rewind and tell the story.
[93] So back in 2012, there was the Big Bang moment of artificial intelligence, or as it was more humbly referred to back then, machine learning.
[94] And that was AlexNet. We talked a lot about this on the last episode.
[95] It was three researchers from the University of Toronto who submitted the AlexNet algorithm to the ImageNet computer vision competition.
[96] Now, ImageNet was a competition where you would look at a set of 14 million images that had been hand-labeled with what the pictures were of, like a strawberry or a cat or a dog or whatever.
[97] And, David, you were telling me the largest-ever use of Mechanical Turk up to that point was labeling the ImageNet data set?
[98] Yeah, it's wild.
[99] I mean, until this competition and until AlexNet, there was no machine learning algorithm that could accurately label images.
[100] So thousands of people on Mechanical Turk got paid, however much, two bucks an hour, to label these images.
[101] Yeah, and if I'm remembering from our episode, basically what happened is the AlexNet team did way better than anybody else had ever done.
[102] A complete step-change better.
[103] I think the error rate went from mislabeling images 25% of the time to suddenly only mislabeling them 15% of the time.
[104] And that was like a huge leap over the tiny incremental progress that had been made along the way.
[105] You're spot on.
[106] And the way that they did it, and what completely changed the fortunes of the internet, of Google, of Facebook, and certainly of Nvidia, was they actually used old algorithms, a branch of computer science and artificial intelligence called neural networks, specifically convolutional neural networks, which had been around since the 60s, but they were really computationally intensive to train.
[107] And so nobody thought it would be practical to actually train and use these things, at least not anytime soon or in our lifetimes.
[108] And what these guys from Toronto did is they went out, probably to their local Best Buy or equivalent in Canada.
[109] They bought two GeForce GTX 580s, which were the top-of-the-line cards at the time.
[110] And they wrote their algorithm, their convolutional neural network, in CUDA, NVIDIA's software development platform for GPUs, and by God, they trained this thing on like $1,000 worth of consumer-grade hardware.
[111] And basically, the algorithm that other people had been trying over the years just wasn't massively parallel the way that a graphics card sort of enables.
[112] So if you actually can consume the full compute of a graphics card, then perhaps
[113] you could run some unique, novel algorithm and do it in, you know, a fraction of the time and expense that it would take in these supercomputer laboratories.
[114] Yeah, everybody before was trying to run these things on CPUs.
[115] CPUs are awesome, but they only execute one instruction at a time.
[116] GPUs, on the other hand, execute hundreds or thousands of instructions at a time.
[117] So GPUs, Nvidia graphics cards, accelerated computing, as Jensen and the company like to call them,
[118] you can really think of them like a giant Archimedes lever.
[119] Whatever advances are happening in Moore's Law and the number of transistors on a chip, if you have an algorithm that can run in parallel, which not all problem spaces can, but many can, then you can basically lever up Moore's Law by hundreds of times or thousands of times, or today tens of thousands of times, and execute something a lot faster than you otherwise could.
[120] And it's so interesting that there was this first market called graphics that was obviously parallel, where every pixel on a screen is not sequentially dependent on the pixel next to it.
[121] It literally can be computed independently and output to the screen.
[122] So you have however many tens of thousands or now hundreds of thousands of pixels on a screen that can all actually be done in parallel.
[123] And little did Nvidia realize, of course, that AI and crypto and all these other linear algebra, matrix math-based things, which turned into accelerated computing, pulling things off the CPU and putting them on the GPU and other parallel processors, were an entire new frontier of other applications that could use the very same technology they had pioneered for graphics.
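To make that parallelism point concrete, here is a minimal sketch (ours, not anything from the episode or from NVIDIA's code) of the kind of per-pixel work a GPU handles well: one CUDA thread per output element, with no element depending on its neighbors. The kernel name brighten, the one-million-pixel image size, and the gain value are illustrative assumptions.

// Illustrative only: each GPU thread owns exactly one "pixel", so the whole
// image is processed in parallel instead of one element per loop iteration on a CPU.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void brighten(const float* in, float* out, int n, float gain) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's pixel index
    if (i < n) {
        out[i] = in[i] * gain;                      // independent work per pixel
    }
}

int main() {
    const int n = 1 << 20;                          // ~1 million "pixels" (assumed size)
    std::vector<float> h_in(n, 0.5f), h_out(n);

    float *d_in, *d_out;
    cudaMalloc((void**)&d_in,  n * sizeof(float));
    cudaMalloc((void**)&d_out, n * sizeof(float));
    cudaMemcpy(d_in, h_in.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every pixel at once.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_in, d_out, n, 1.2f);
    cudaDeviceSynchronize();

    cudaMemcpy(h_out.data(), d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("first pixel: %f\n", h_out[0]);          // 0.5 * 1.2 = 0.6

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}

The same pattern, with the per-pixel multiply swapped for the multiply-accumulates of a convolution or a matrix multiply, is what made CUDA useful for neural networks and the rest of accelerated computing.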
[124] Yeah, it was pretty useful stuff.
[125] And this AlexNet moment and these three researchers from Toronto kicked off what Jensen calls, and he's absolutely right, the Big Bang moment for AI.
[126] So David, the last time we told this story in full, we talked about this team from Toronto.
[127] We did not follow what this team of three went on to do afterwards.
[128] Yeah.
[129] So basically what we said was, it turned out that a natural consequence of what these guys were doing was, oh, actually you can use this to surface the next post in a social media feed, like an Instagram feed or the YouTube feed or something like that.
[130] And that unlocked billions and billions of value.
[131] And those guys and everybody else working in the field, they all got scooped up by Google and Facebook.
[132] Well, that's true.
[133] And then as a consequence of that, Google and Facebook started buying a lot of Nvidia GPUs.
[134] But it turns out there's also another chapter to that story that we completely skipped over.
[135] And it starts with the question you asked, Ben, who are these people?
[136] Yes.
[137] So the three people who made up the AlexNet team were, of course, Alex Krizhevsky, who was a PhD student under his faculty advisor, the legendary computer science professor Geoff Hinton.
[138] I have an amazing piece of trivia about Geoff Hinton.
[139] Do you know who his great-great-grandparents were?
[140] No, I have no idea.
[141] He is the great-great-grandson of George and Mary Boole.
[142] You know, like Boolean algebra and Boolean logic?
[143] This guy was born to be a computer science researcher.
[144] Oh, my God.
[145] Right.
[146] Foundational stuff for computation and computer science.
[147] I also didn't know there were people named Boole, or that that's where that came from.
[148] That's hilarious.
[149] Yeah, you know, the AND, OR, XOR, NOR operators.
[150] That comes from George and Mary.
[151] Wild.
[152] So he's the faculty advisor.
[153] And then there was a third person on the team, Alex's fellow PhD student in this lab, one Ilya Sutskever.
[154] And if you know where we're going with this, you are probably jumping up and down right now in your seat.
[155] Ilya is the co-founder and current chief scientist of OpenAI.