#96 – Stephen Schwarzman: Going Big in Business, Investing, and AI

Lex Fridman Podcast XX


Full Transcription:

[0] The following is a conversation with Stephen Schwarzman, CEO and co-founder of Blackstone, one of the world's leading investment firms, with over $530 billion of assets under management.

[1] He's one of the most successful business leaders in history.

[2] I recommend his recent book called What It Takes, that tells stories and lessons from his personal journey.

[3] Stephen is a philanthropist and one of the wealthiest people in the world, recently signing the Giving Pledge, thereby committing to give the majority of his wealth to philanthropic causes.

[4] As an example, in 2018, he donated $350 million to MIT to help establish its new College of Computing, the mission of which promotes interdisciplinary, big, bold research in artificial intelligence.

[5] Those of you who know me know that MIT is near and dear to my heart, and always will be.

[6] It was and is a place where, I believe, big, bold, revolutionary ideas have a home.

[7] And that is what is needed in artificial intelligence research in the coming decades.

[8] Yes, there are institutional challenges, but there's also power in the passion of individual researchers, from undergrad to Ph.D., from young scientists to senior faculty.

[9] I believe the dream to build intelligence systems burns brighter than ever in the halls of MIT.

[10] This conversation was recorded recently, but before the outbreak of the pandemic.

[11] For everyone feeling the burden of this crisis, I'm sending love your way.

[12] Stay strong.

[13] We're in this together.

[14] This is the Artificial Intelligence Podcast.

[15] If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts, support it on Patreon, or simply connect with me on Twitter at Lex Fridman, spelled F-R-I-D-M-A-N.

[16] As usual, I'll do a few minutes of ads now and never any ads in the middle that can break the flow of the conversation.

[17] I hope that works for you and doesn't hurt the listening experience.

[18] Quick summary of the ads.

[19] Two sponsors, Masterclass and ExpressVPN.

[20] Please consider supporting the podcast by signing up to Masterclass at masterclass.com slash Lex.

[21] And getting ExpressVPN at expressvpn.com slash LexPod.

[22] This show is sponsored by Masterclass.

[23] Sign up at masterclass.com slash Lex to get a discount and support this podcast.

[24] When I first heard about Masterclass, I thought it was too good to be true.

[25] For $180 a year, you get an all-access pass to watch courses from, to list some of my favorites:

[26] Chris Hadfield on space exploration, Neil deGrasse Tyson on scientific thinking and communication, Will Wright, creator of SimCity and The Sims, on game design, Carlos Santana on guitar, Garry Kasparov on chess, Daniel Negreanu on poker, and many, many more.

[27] Chris Hadfield explaining how rockets work and the experience of being launched into space alone is worth the money.

[28] By the way, you can watch it on basically any device.

[29] Once again, sign up at masterclass.com slash Lex to get a discount and to support this podcast.

[30] This show is sponsored by ExpressVPN.

[31] Get it at expressvpn.com slash LexPod to get a discount and to support this podcast.

[32] I've been using ExpressVPN for many years.

[33] I love it.

[34] It's easy to use.

[35] Press the big power on button, and your privacy is protected.

[36] And, if you like, you can make it look like your location is anywhere else in the world.

[37] I might be in Boston now, but I can make it look like I'm in New York, London, Paris, or anywhere else in the world.

[38] This has a large number of obvious benefits.

[39] Certainly, it allows you to access international versions of streaming websites like the Japanese Netflix or the UK Hulu.

[40] ExpressVPN works on any device you can imagine.

[41] I use it on Linux.

[42] Shout-out to Ubuntu 20.04, Windows, Android, but it's available everywhere else too.

[43] Once again, get it at expressvpn.com slash LexPod to get a discount and to support this podcast.

[44] And now, here's my conversation with Stephen Schwarzman.

[45] Let's start with a tough question.

[46] What idea do you believe, whether grounded in data or intuition, that many people you respect disagree with you on?

[47] Well, there isn't all that much anymore since the world's so transparent.

[48] But one of the things I believe in, and put in the book What It Takes, is if you're going to do something, do something very consequential, do something that's quite large, if you can, that's unique.

[49] Because if you operate in that kind of space, when you're successful, it's a huge impact.

[50] The prospect of success enables you to recruit people who want to be part of that.

[51] And those type of large opportunities are pretty easily described.

[52] And so not everybody likes to operate at scale.

[53] Some people like to do small things because it's meaningful for them emotionally.

[54] And so occasionally you get a disagreement on that.

[55] But those are life choices rather than commercial choices.

[56] That's interesting.

[57] What good and bad comes with going big?

[58] We often in America think big is good.

[59] What's the benefit, what's the cost, in terms of not just business, but life, happiness, the pursuit of happiness?

[60] Well, you do things that make you happy.

[61] It's not mandated, and everybody's different.

[62] And some people, you know, if they have talent, like playing pro football, you know, other people just like throwing the ball around, you know, not even being on a team.

[63] What's better?

[64] Depends what your objectives are.

[65] Depends what your talent is.

[66] You know, it depends, you know, what gives you joy.

[67] So in terms of going big, is it both for impact on the world and because it personally gives you joy? Well, it makes it easier to succeed, actually. Because if you catch something, for example, that's cyclical, that's a huge opportunity, then you usually can find some place within that huge opportunity where you can make it work. If you're prosecuting a really small thing and you're wrong, you don't have many places to go.

[68] So, you know, I've always found that to be the easy place to be.

[69] And, you know, the ability where you can concentrate human resources, get people excited about doing, like, really impactful big things, and you can afford to pay them, actually, because the bigger thing can generate much more in the way of financial resources.

[70] So that brings out people of talent to help you.

[71] And so altogether it's a virtuous circle, I think.

[72] How do you know an opportunity when you see one in terms of the one you want to go big on?

[73] Is it intuition?

[74] Is it facts?

[75] Is it back-and-forth deliberation with people you trust? What's the process? Is it art? Is it science? Well, it's pattern recognition. And how do you get to pattern recognition? First, you need to understand the patterns and the changes that are happening. And that's observational on some level.

[76] You can call it data, or you can just call it listening to unusual things that people are saying that they haven't said before.

[77] And, you know, I've always tried to describe this.

[78] It's like seeing a piece of white lint on a black dress, but most people disregard that piece of lint.

[79] They just see the dress.

[80] I always see the lint.

[81] And I'm fascinated by how did something get someplace it's not supposed to be.

[82] So it doesn't even need to be a big discrepancy.

[83] But if something shouldn't be someplace in a constellation of facts that, you know, sort of makes sense in a traditional way, I've learned that if you focus on why one discordant note is there, that's usually a key to something important.

[85] And if you can find two of those discordant notes, that's usually a straight line to someplace.

[86] And that someplace is not where you've been.

[87] And usually when you figure out that things are changing or have changed, and you describe them, which you have to be able to do, because it's not some odd intuition.

[88] It's just focusing on facts.

[89] It's almost like a scientific discovery, if you will, when you describe it to other people in the real world, they tend to do absolutely nothing about it.

[90] And that's because humans are comfortable in their own reality.

[91] And if there's no particular reason at that moment to shake them out of their reality, they'll stay in it even if they're ultimately completely wrong.

[92] And I've always been stunned.

[93] that when I explain where we're going, what we're doing, and why, almost everyone just says, that's interesting.

[94] And they continue doing what they're doing.

[95] And so, you know, I think it's pretty easy to do that.

[96] You know, but what you need is a huge data set.

[97] So, you know, before AI and people's focus on data, you know, I've sort of been doing this mostly my whole life.

[98] I'm not a scientist, let alone a computer scientist.

[99] And, you know, you can just hear what people are saying when somebody says something or you observe something that simply doesn't make sense.

[100] That's when you really go to work.

[101] The rest of it's just processing.

[102] You know, on a quick tangent, pattern recognition is a term often used throughout the history of AI.

[103] That's the goal of artificial intelligence is pattern recognition, right?

[104] but there's, I would say, various flavors of that.

[105] So usually pattern recognition refers to the process of the, what you said, dress and the lint on the dress.

[106] Pattern recognition is very good identifying the dress as looking at the pattern that's always there, that's very common, and so on.

[107] You're almost referring to something that's like what's called outlier detection in computer science, right?

[108] The rare thing, the small thing.

[109] Now, AI is not often good at that.

[110] Just almost philosophically, the kind of decisions you made in your life, based almost scientifically on data, do you think AI in the future will be able to make them?

[111] Is it something that could be put down into code?

[112] Or is it still deeply human?

[113] It's tough for me to say, since I don't have domain knowledge in AI, to know everything that could or might occur. I know, sort of, in my own case, that most people don't see any of that, right? I just assumed it was motivational, you know, but it's also sort of, it's hard wiring.

[114] What are you wired or programmed to be finding or looking for?

[115] It's not what happens every day.

[116] That's not interesting, frankly.

[117] I mean, that's what people mostly do.

[118] I do a bunch of that too, because, you know, that's what you do in normal life.

[119] But I've always been completely fascinated by the stuff that doesn't fit.

[120] Or the other way of thinking about it, it's determining what people want without them saying it.

[121] That's a different kind of pattern.

[122] You can see everything they're doing.

[123] There's a missing piece.

[124] They don't know it's missing.

[125] You think it's missing, given the other facts you know about them.

[126] And you deliver that, and then that becomes, you know, sort of very easy to sell to them.

[127] To linger on this point a little bit, you've mentioned that in your family, when you were growing up, nobody raised their voice in anger or otherwise.

[128] And you said that this allows you to learn to listen and hear some interesting things.

[129] Can you elaborate, as you have been, on that idea? What do you hear about the world if you listen? Well, you have to listen really intensely to understand what people are saying, as well as what people are intending, because it's not necessarily the same thing. And people mostly give themselves away, no matter how clever they think they are.

[130] Particularly if you have the full array of inputs.

[131] In other words, if you look at their face, you look at their eyes, which are the window on the soul, it's very difficult to conceal what you're thinking.

[132] You look at facial expressions and posture.

[133] You listen to their voice which changes.

[134] You know, when you're talking about something you're comfortable with or not, are you speaking faster? Is the amplitude of what you're saying higher?

[135] Most people just give away what's really on their mind.

[136] You know, they're not that clever.

[137] They're busy spending their time thinking about what they're in the process of saying.

[138] And so if you just observe that, not in a hostile way, but just in an evocative way and just let them talk for a while, they'll more or less tell you almost completely what they're thinking, even the stuff they don't want you to know.

[139] And once you know that, of course, it's sort of easy to play that kind of game because they've already told you everything you need to know.

[140] And so it's easy to get to a conclusion if there's meant to be one, an area of common interest, since you know almost exactly what's on their mind.

[141] And so that's an enormous advantage as opposed to just walking in someplace and somebody telling you something and you believing what they're saying.

[142] There are so many different levels of communication.

[143] So a powerful approach to life you discuss in the book, on the topic of listening and really hearing people, is figuring out what the biggest problem bothering a particular individual or group is, coming up with a solution to that problem, and presenting them with the solution, right?

[144] In fact, you brilliantly describe a lot of simple things that most people just don't do.

[145] It's kind of obvious, find a problem that's bothering somebody deeply.

[146] And as you said, I think you've implied that they will usually tell you what the problem is.

[147] But can you talk about this process of seeing what the biggest problem for a person is, trying to solve it, and maybe a particularly memorable example?

[148] Sure.

[149] You know, if you know you're going to meet somebody, there are two types of situations, chance meetings, and the second is you know you're going to meet somebody.

[150] So let's take the easiest one, which is you know you're going to meet somebody.

[151] and you start trying to make pretend you're them.

[152] It's really easy.

[153] What's on their mind?

[154] What are they thinking about in their daily life?

[155] What are the big problems they're facing?

[156] So if they're, you know, to make it a really easy example, you know, make pretend, you know, they're like president of the United States.

[157] Doesn't have to be this president.

[158] It can be any president.

[159] So you sort of know what's more or less on their mind because the press keeps reporting it.

[160] And you see it on television.

[161] You hear it.

[162] People discuss it.

[163] So you know if you're going to be running into somebody in that kind of position, you sort of know what they look like already.

[164] You know what they sound like.

[165] You know what their voice is like.

[166] And you know what they're focused on.

[167] And so, if you're going to meet somebody like that, what you should do is take the biggest unresolved issue that they're facing and come up with a few interesting solutions that basically haven't been out there, or that you haven't heard anybody else was thinking about.

[168] So just to give you an example, it was sort of in the early 1990s and I was invited to something at the White House, which was a big deal for me because I was like, you know, a person from no place.

[169] And, you know, I had met the president once before because it was President Bush because his son was in my dormitory.

[170] So I had met him at Parents' Day.

[171] I mean, it's just like the oddity of things.

[172] So I knew I was going to see him because, you know, that's where the invitation came from.

[173] And so there was something going on.

[174] And I just thought about, you know, two or three ways to approach that issue.

[175] And, you know, at that point, I was separated, and so I had brought a date to the White House.

[176] And so I saw the president, and we sort of went over in a corner for about 10 minutes and discussed whatever this issue was.

[177] And I later, you know, went back to my date.

[178] It was a little rude, but it was meant to be a confidential conversation.

[179] And I barely knew her.

[180] And, you know, she said, what were you talking about all that time?

[182] I said, well, you know, there's something going on in the world, and I've thought about different ways of perhaps approaching that, and he was interested.

[183] And the answer is, of course, he was interested.

[184] Why wouldn't he be interested?

[185] There didn't seem to be an easy outcome.

[186] And so, you know, conversations of that type, once somebody knows, you're really thinking about what's good for them and good for the situation.

[187] It has nothing to do with me. I mean, it's really about being in service, you know, to the situation.

[188] Then people trust you and they'll tell you other things because they know your motives are basically very pure.

[189] You're just trying to resolve a difficult situation and help somebody do it.

[190] So these types of things, you know, that's a planned situation.

[191] That's easy.

[192] Sometimes you just come upon somebody and they start talking.

[193] And, you know, that requires, you know, like different skills.

[194] You know, you can ask them, what have you been working on lately?

[195] What are you thinking about?

[196] You can ask them, you know, has anything been particularly difficult?

[197] And, you know, you can ask them. Most people, if they trust you for some reason, they'll tell you.

[199] And then you have to instantly go to work on it.

[200] And, you know, that's not as good as having some advance planning. But, you know, almost everything going on is, like, out there. And people who are involved with interesting situations, they're playing in the same ecosystem. They just have different roles in the system.

[201] And, you know, you can do that with somebody who owns a pro football team that loses all the time.

[202] We specialize in those in New York.

[203] And, you know, you already have analyzed why they're losing, right?

[204] Inevitably, it's because they don't have a great quarterback, they don't have a great coach, and they don't have a great general manager who knows how to hire the best talent.

[205] Those are the three reasons why a team fails, right?

[206] Because there are salary caps.

[207] So every team pays the same amount of money for all their players.

[208] So it's got to be those three positions.

[209] So if you're talking with somebody like that, inevitably, even though it's not structured, you'll know how their team's doing and you'll know pretty much why.

[210] And if you start asking questions about that, they're typically very happy to talk about it because they haven't solved that problem.

[211] In some case, they don't even know that's the problem.

[212] It's pretty easy to see it.

[213] So, you know, I do stuff like that, which I find is intuitive as a process, but, you know, leads to really good results.

[214] Well, the funny thing is when you're smart, for smart people, it's hard to escape their own ego and the space of their own problems, which is what's required to think about other people's problems.

[215] It requires for you to let go of the fact that your own problems are all important.

[216] And while it seems obvious, and I think quite brilliant, it's a difficult leap for many people, especially smart people, to truly empathize with the problems of others. Well, I have a competitive advantage, which is I don't think I'm so smart. So, you know, it's not a problem for me. Well, the truly smartest people I know say that exact same thing. Yeah, being humble is a really useful competitive advantage, as you said. How do you stay humble? Well, I haven't changed much.

[217] Since...

[218] Since I was in my mid -teens, you know, I was raised partly in the city and partly in the suburbs.

[219] And, you know, whatever the values I had at that time, those are still my values.

[220] I call them like middle -class values.

[221] That's how I was raised.

[222] And I've never changed.

[223] Why would I?

[224] That's who I am.

[225] And so the accoutrements of, you know, the rest of your life have got to be put on the same, you know, solid foundation of who you are.

[226] Because if you start losing who you really are, who are you?

[227] So I've never had the desire to be somebody else.

[228] I just do other things now that I wouldn't do as a, you know, sort of middle-class kid from Philadelphia.

[230] I mean, my life has morphed on a certain level, but part of the strength of having integrity of personality is that you can remain in touch with everybody who comes from that kind of background.

[231] And, you know, even though I do some things that aren't like that, you know, in terms of people I'd meet or situations I'm in, I always look at it through the same lens.

[232] And that's very psychologically comfortable and doesn't require me to make any real adjustments in my life.

[233] And I just keep plowing ahead.

[234] There's been a lot of activity and progress in recent years around effective altruism.

[235] I wanted to bring up this topic with you because it's an interesting one from your perspective.

[236] You can put it in any kind of terms, but it's philanthropy that focuses on maximizing impact.

[237] How do you see the goal of philanthropy, both from a personal motivation perspective and a societal big picture impact perspective?

[238] Yeah, I don't think about philanthropy the way you would expect me to, okay?

[239] I look at, you know, sort of solving big issues, addressing big issues, starting new organizations to do it, much like we do in our business.

[240] You know, we keep growing our business, not by taking the original thing and making it larger, but continually seeing new things and building those.

[241] And, you know, sort of marshalling financial resources, human resources.

[242] And in our case, because we're in the investment business, we find something new that looks like it's going to be terrific.

[243] And we do that, and it works out really well.

[244] all I do in what you would call philanthropy is look at other opportunities to help society.

[245] And I end up starting something new, marshalling people, marshalling a lot of money.

[246] And then at the end of that kind of creative process, somebody typically asks me to write a check.

[247] I don't wake up and say, how can I, like, give large amounts of money away?

[248] I look at issues that are important for people.

[249] In some cases, I do smaller things because it's important to a person.

[250] And, you know, I have, you know, sort of, I can relate to that person.

[251] There's some unfairness that's happened to them.

[252] And so in situations like that, I'd give money anonymously and help them out.

[253] And, you know, it's like a miniature version of addressing something really big.

[255] So, you know, at MIT, I've done a big thing, you know, helping to start this new school of computing.

[256] And I did that because, you know, I saw that, you know, there's sort of like a global race on in AI, quantum, and other major technologies.

[257] And I thought that the U.S. could use more enhancement from a competitive perspective.

[258] And I also, because I get to China a lot and I travel around a lot compared to a regular person, you know, I can see the need to have control of these types of technologies.

[259] So when they're introduced, we don't create a mess like we did with the Internet and with social media, an unintended consequence, you know, that's creating all kinds of issues in freedom of speech and the functioning of liberal democracies.

[260] So with AI, it was pretty clear that there were enormous differences of views around the world among the relatively few practitioners who really knew what was going on.

[261] And by accident, I knew a bunch of these people, you know, who were like big famous people.

[262] And I could talk to them and say, why do you think this is a force for bad?

[263] And someone else, why do you feel this is a force for good?

[264] And how do we move forward with the technology, but at the same time make sure that whatever is potentially, you know, sort of on the bad side of this technology, with, for example, disruption of workforces and things like that, that could happen much faster than in the Industrial Revolution?

[265] What do we do about that, and how do we keep that under control, so that the really good things about these technologies, which will be great things, not just good things, are allowed to happen? So, to me, you know, this was one of the great issues facing society. The number of people who were aware of it was very small. I just accidentally got sucked into it.

[266] And as soon as I saw it, I went, oh, my God, this is mega, both on a competitive basis globally, but also in terms of protecting society and benefiting society.

[267] So that's how I got involved.

[268] And at the end, you know, sort of the right thing that we figured out was, you know, sort of double MIT's computer science faculty.

[269] and basically create the first AI-enabled university in the world and, you know, in effect, be an example, a beacon, to the rest of the research community around the world academically, and create, you know, a much more robust U.S. competitive situation among the universities. Because if MIT was going to raise a lot of money and double its faculty, well, you could bet that a number of other universities were going to do the same thing.

[270] At the end of it, it would be great for knowledge creation, you know, great for the United States, great for the world.

[271] And so I like to do things that I think are really positive, things that other people aren't on, that I see for whatever the reason.

[272] First, it's just people I meet and what they say and I can recognize when something really profound is about to happen or needs to.

[273] And I do it.

[274] And at the end of the situation, somebody says, can you write a check to help us?

[275] And then the answer is sure.

[276] I mean, because if I don't, the vision won't happen.

[277] But it's the vision of whatever I do.

[278] that is compelling.

[279] And essentially, I love that idea of whether it's small at the individual level or really big, like the gift to MIT to launch the College of Computing, it starts with a vision and you see philanthropy as the biggest impact you can have is by launching something new, especially on an issue that others aren't really addressing.

[280] And I also love the notion, and you're absolutely right, that there are other universities, Stanford, CMU, I'm looking at you, and essentially this seed will have a ripple effect that potentially might help the US be a leader, or continue to be a leader, in AI, this potentially very transformative research direction.

[281] Just to linger on that point a little bit, what is your hope, long-term, for the impact the college here at MIT might have in the next 5, 10, even 20, or, let's get crazy, 30, 50 years?

[282] Well, it's very difficult to predict the future when you're dealing with knowledge production and creativity.

[283] You know, MIT has obviously some unique aspects, you know, globally.

[284] And, you know, there are four big sort of academic surveys.

[285] I forget whether it was QS.

[286] There's the Times in London, you know, the U.S. News, and whatever.

[287] In one of these recently, MIT was ranked number one in the world, right?

[288] So leave aside whether you're number three somewhere else; in the great sweep of humanity, this is pretty amazing, right?

[290] So you have a really remarkable aggregation of human talent here.

[291] And where it goes, it's hard to tell.

[292] You have to be a scientist to have the right feel.

[293] But what's important is you have a critical mass of people.

[294] I think it breaks into two buckets.

[295] One is scientific advancement.

[296] And if the new college can help, you know, sort of either serve as a convening force within the university or help sort of coordination and communication among people, that's a good thing.

[297] Absolute good thing.

[298] The second thing is in the AI ethics area, which is in a way equally important, because if the science side creates blowback, so that science is, you know, a bit crippled in terms of going forward, because society's reaction

[299] to knowledge advancement in this field becomes really hostile, then you've sort of lost the game in terms of scientific progress and innovation.

[300] And so the AI ethics piece is super important because, you know, in a perfect world, MIT would serve as a global convener, because what you need is the research universities.

[301] You need the companies that are driving AI and quantum work.

[302] You need governments who will ultimately be regulating certain elements of this.

[303] And you also need the media to be knowledgeable and trained.

[304] So we don't get sort of overreactions to one situation.

[305] which then goes viral, and it ends up shutting down avenues that are perfectly fine, you know, to be walking down or running down.

[306] But if enough discordant information, not even correct necessarily, you know, sort of gets pushed around society, then you can end up with a really hostile regulatory environment and other things.

[307] So you have four drivers that have to be sort of integrated.

[308] And so if the new school of computing can be really helpful in that regard, then that's a real service to science and it's a service to MIT.

[309] So that's why I wanted to get involved for both areas.

[310] And the hope is, for me, for others, for everyone, for the world, is for this particular college of computing to be a beacon and a connector for these ideas.

[311] Yeah, that's right.

[312] I mean, I think MIT is perfectly positioned to do that.

[313] So you've mentioned the media, social media, the Internet, as this complex network of communication with flaws, perhaps.

[314] Perhaps you can speak to them.

[315] But, you know, I personally think that science and technology has its flaws, but ultimately is, one, sexy, exciting.

[316] It's the way for us to explore and understand the mysteries of our world.

[317] And two, most perhaps more importantly for some people, it's a really powerful way to grow the economy, to improve the quality of life for everyone. So how do you see the media, social media, the internet, as a society, having, you know, healthy discourse about science? First of all, one that's factual, and two, one that finds science exciting, that invests in science, that pushes it forward, especially in this science-fiction, fear-filled field of artificial intelligence?

[318] Well, I think that's a little above my pay grade because, you know, trying to control social media to make it do what you want to do.

[319] Sure.

[320] It appears to be beyond almost anybody's control.

[321] And the technology is being used to create what I called the tyranny of the minorities, okay?

[322] A minority is defined as, you know, two or three people on a street corner.

[323] It doesn't matter what they look like.

[324] It doesn't matter where they came from.

[325] They're united by that one issue that they care about, and their job is to enforce their views on the world.

[326] And, you know, in the political world, people just are

[327] manufacturing truth, and they throw it all over, and it affects all of us.

[328] And, you know, sometimes people are just hired to do that.

[329] I mean, it's amazing.

[330] And you think it's one person, it's really, you know, just sort of a front, you know, for a particular point of view.

[331] And this has become exceptionally disruptive for society, and it's dangerous, and it's undercutting, you know, the ability of liberal democracies to function.

[332] And I don't know how to get a grip on this.

[333] And I was really surprised when I, you know, was up here for the announcement last spring of the College of Computing.

[334] And they had all these famous scientists, some of whom were involved with the invention

[335] of the internet, and almost every one of them got up and said, I think I made a mistake.

[336] And as a non -scientist, I never thought I'd hear anyone say that.

[337] And what they said is, more or less, to make it simple, we thought this would be really cool inventing the internet.

[338] We could connect everyone in the world.

[339] We can move knowledge around.

[340] It was instantaneous.

[341] It's a really amazing thing.

[342] He said, I don't know that there was anyone who ever thought about social media coming out of that and the actual consequences for people's lives.

[343] You know, there's always some younger person.

[344] I just saw one of these yesterday.

[345] It was reported on the national news, who killed himself when people used social media to basically, you know, sort of

[346] ridicule him or something of that type.

[347] This is bad.

[348] This is dangerous.

[349] And, you know, so I don't have a solution for that other than going forward.

[350] You can't end up with this type of outcome using AI.

[351] To make this kind of mistake twice is unforgivable.

[352] So interestingly, at least in the West, and parts of China, people are quite sympathetic to, you know, sort of the whole concept of AI ethics and what gets introduced when, and cooperation within your own country, within your own industry, as well as globally, to make sure that the technology is a force for good.

[353] On that really interesting topic: since 2007, you've had a relationship with senior leadership, with a lot of people in China, and an interest in understanding modern China, their culture, their world. Much like with Russia, I'm from Russia originally, Americans are told a very narrow, one-sided story about China that I'm sure misses a lot of fascinating complexity, both positive and negative. What lessons about Chinese culture, its ideas, as a nation, its future, do you think Americans should know about, deliberate on, think about?

[354] Well, it's sort of a wide question that you're asking about.

[355] You know, China's a pretty unusual place.

[356] First, it's huge.

[357] You know, it's physically huge.

[358] It's got a billion three people.

[359] and the character of the people isn't as well understood in the United States.

[360] Chinese people are amazingly energetic.

[361] If you're one of a billion three people, one of the things you've got to be focused on is how do you make your way through a crowd of 1,299,999,999 other people.

[363] Another word for that is competitive.

[364] Yes. They are individually highly energetic, highly focused, always looking for some opportunity for themselves, because they need to, because there's literally an enormous number of people around.

[365] And so what I've found is they'll try and find a way to win for themselves. And their country is complicated, because it basically doesn't have the same kind of functional laws that we do in the United States, in the West. The country is controlled really through a web of relationships you have with other people, and the relationships that those other people have with other people.

[366] So it's an incredibly dynamic culture, where if somebody knocks off somebody at the top who's three levels above you and is in effect protecting you, then, you know, you're like, you know, sort of a floating molecule there, you know, without tethering except the one or two layers above you, but that's going to get affected.

[367] So it's a very dynamic system and getting people to change is not that easy because if there aren't really functioning laws, it's only the relationships that everybody has.

[368] And so when you decide to make a major change and you sign up for it, something is changing in your life.

[369] There won't necessarily be all the same people on your team.

[370] And that's a very high risk enterprise.

[371] So when you're dealing with China, it's important to know almost what everybody's relationship is with somebody.

[372] So when you suggest doing something differently, you have to line up these forces. In the West, it's usually you talk to a person and they figure out what's good for them.

[373] It's a lot easier, and in that sense, in a funny way, it's easier to make change in the West, just the opposite of what people think. But once the Chinese system adjusts to something that's new, everybody's on the team. It's hard to change them, but once they're changed, they are incredibly focused in a way that it's hard for the West to do, in a more individualistic culture.

[374] So there are all kinds of fascinating things.

[375] One thing that might interest, you know, the people who are listening, who are more technologically based than some other group.

[376] I was with one of the top people in the government a few weeks ago, and he was telling me that every school child in China is going to be taught computer science.

[377] Now imagine 100% of these children.

[378] This is such a large number of human beings.

[379] Now, that doesn't mean that every one of them will be good at computer science, but if it's sort of like in the West, if it's like math or English, everybody's going to take it.

[380] Not everybody's great at English, they don't write books, they don't write poetry, and not everybody's good at math.

[381] You know, somebody like myself, I sort of evolved to the third grade, and I'm still doing flashcards.

[382] You know, I didn't make it further in math.

[383] But imagine everybody in their society is going to be involved with computer science.

[384] I just even pause on that.

[385] I think computer science involves, at the basic beginner level, programming, and the idea that everybody in the society would have some ability to program a computer is incredible.

[386] For me, it's incredibly exciting, and I think that should give the United States pause, to consider, talking about philanthropy, launching things, there's nothing like investing in the young, the youth, the education system, because that's where everything launches.

[387] Yes.

[388] Well, we've got a complicated system because we have over 3,000 school districts around the country.

[389] China doesn't worry about that as a concept.

[390] They make a decision at the very top of the government that that's what they want to have happen, and that is what will happen. And we're really handicapped by this distributed, you know, power in the education area, although some people involved with that area will think it's great. But, you know, you would know better than I do what percent of American children have computer science exposure. My guess, with no knowledge, would be 5% or less. And if we're going to be going into a world where the other major economic power, sort of like ourselves, has got like 100% and we've got 5, and the whole computer science area is the future, then we're purposely or accidentally actually handicapping ourselves, and our system doesn't allow us to adjust quickly to that.

[391] So, you know, the issues like this I find fascinating.

[392] And, you know, if you're lucky enough to go to other countries, which I do, and you learn what they're thinking, then it informs what we ought to be doing in the United States.

[393] So the current administration, Donald Trump, has released an executive order on artificial intelligence.

[394] I'm not sure if you're familiar with it; this was in 2019.

[395] Looking several years ahead, how does America sort of, we've mentioned in terms of the big impact, we hope your investment in MIT will have a ripple effect, but from a federal perspective, from a government perspective, how does America establish, with respect to China, leadership in the world at the top for research and development in AI?

[396] I think that you have to get the federal government in the game in a big way, and that this leap forward technologically, which is going to happen with or without us, you know, really should be with us.

[397] And it's an opportunity, in effect, for another moonshot kind of mobilization by the United States.

[398] I think the appetite actually is there to do that.

[399] At the moment, what's getting in the way is the kind of poisonous politics we have.

[400] But if you go below the lack of cooperation, which is almost the defining element of American democracy right now in the Congress, if you talk to individual members, they get it and they would like to do something.

[401] Another part of the issue is we're running huge deficits.

[402] We're running trillion -dollar -plus deficits.

[403] So how much money do you need for this initiative?

[404] Where does it come from?

[405] Who's prepared to stand up for it?

[406] Because if it involves taking away resources from another area, our political system is not real flexible to do that.

[407] If you're creating this kind of initiative, which we need, where does the money come from? And trying to get money when you've got trillion-dollar deficits, in a way, it could be easy. What's the difference between a trillion and a trillion and a little more? But, you know, it's hard with the mechanisms of Congress. But what's really important is this is not an issue that is unknown, and it's viewed as a very important issue, and there's almost no one in the Congress, when you sit down and explain what's going on, who doesn't say we've got to do something.

[408] Let me ask the impossible question.

[409] So you didn't endorse Donald Trump, but after he was elected, you have given him advice, which seems to me a great thing to do, no matter who the president is, to contribute, positively contribute to this nation by giving advice.

[410] And yet, you've received a lot of criticism for this.

[411] So on the previous topic of science and technology and government, how do we have a healthy discourse, give advice, have an excited conversation with the government about science and technology without it becoming politicized?

[412] Well, it's very interesting.

[413] So when I was young, before there was a moonshot, we had a president named John F. Kennedy from Massachusetts here.

[414] And in his inaugural address as president, he said, ask not what your country can do for you, but what you can do for your country.

[415] You know, we had a generation of people, my age, basically people, who grew up with that credo.

[416] And, you know, sometimes you don't need to innovate.

[417] You can go back to basic principles.

[418] And that's a good basic principle.

[419] What can we do?

[420] You know, Americans have GDP per capita of around $60,000.

[421] You know, not every, it's not equally distributed, but it's big. And, you know, people have, I think, an obligation to help their country, and I do that. And apparently I take some grief from some people, you know, who project on me things I don't even believe. But I'm, like, quite simple, you know. I tried to help the previous President, Obama. He was a good guy, and he was a different party. And I tried to help President Bush, and he's a different party. And, you know, I sort of don't care that much about what the parties are, even though I'm a big donor for the Republicans. What motivates me is, what are the problems we're facing?

[422] Can I help people get to, you know, sort of a good outcome that'll stand any test?

[423] But we live in a world now where, you know, sort of the filters and the hostility are so unbelievable.

[424] You know, in the 1960s, when I went to school, in the university, I went to Yale.

[425] We had, like, so much stuff going on.

[426] We had a war called the Vietnam War.

[427] We had, you know, sort of black power starting.

[428] And, you know, we had a sexual revolution with the birth control pill.

[429] And, you know, there was one other major thing going on.

[430] Right, the drug revolution.

[431] There hasn't been a generation that had more stuff going on in a four-year period than my era.

[432] Yet, there wasn't this kind of instant hostility if you believed something different.

[433] Everybody lived together and, you know, respected the other person.

[435] And I think that, you know, this type of change needs to happen.

[436] And it's got to happen from the leadership of our major institutions.

[437] And I don't think that leaders can be bullied by people who are against, you know, sort of the classical version of free speech, open expression, and inquiry.

[438] That's what universities are

[439] for, among other things, Socratic methods. And so, in the midst of this, like, onslaught of oddness, I still believe in the basic principles, and we're going to have to find a way to get back to that.

[440] And that doesn't start with the people, you know, sort of in the middle to the bottom who are using, you know, these kinds of screens to shout people down and create an uncooperative environment.

[441] It's got to be done at the top with core principles that are articulated.

[442] And ironically, if people don't sign on to these kinds of core principles, where people are equal and, you know, speech can be heard,

[443] and, you know, you don't

[444] have these enormous shout-down biases, subtly or out loud, then they don't belong at those institutions.

[445] They're violating the core principles.

[446] And, you know, that's how you end up making change, but you have to have courageous people who are willing to lay that out for the benefit of not just their institutions, but for society as a whole.

[447] So I believe that will happen, but it needs the commitment of senior people to make it happen.

[448] Courage, and I think for such great leaders, great universities, there's a huge hunger for it.

[449] So I, too, am very optimistic that it will come.

[450] I'm now personally taking a step into building a startup for the first time, hoping to change the world, of course.

[451] There are thousands, maybe more, maybe millions of other first-time entrepreneurs like me. You've gone through this process.

[452] You've talked about the suffering, the emotional turmoil it all might entail.

[453] What advice do you have for those people taking that step?

[454] I'd say it's a rough ride, and you have to be psychologically prepared for

[455] things going wrong with frequency.

[456] You have to be prepared to be put in situations where you're being asked to solve problems.

[457] You didn't even know those problems existed.

[458] For example, renting space, it's not really a problem unless you've never done it.

[459] You have no idea what a lease looks like, right?

[460] You don't even know the relevant rent, you know, in a market.

[461] So everything is new.

[462] Everything has to be learned.

[463] What you realize is that it's good to have other people with you who've had some experience in areas where you don't know what you're doing.

[464] Unfortunately, an entrepreneur starting doesn't know much of anything.

[465] So everything is something new.

[466] And I think it's important not to be alone because it's sort of overwhelming and you need somebody to talk to other than a spouse or a loved one because even they get bored with your problems.

[467] And so, you know, getting a group, you know, if you look at Alibaba, you know, Jack Ma was telling me they basically were, like, at death's door financially at least twice.

[468] And, you know, it wasn't just Jack. I mean, people think it is because, you know, he became, you know, the sort of public face and the driver. But a group of people who can give advice, share situations to talk about, that's really important.

[469] And that's not just referring to the small details like renting space.

[470] No. It's also the psychological.

[471] Yes.

[472] Pardon.

[473] Yeah, and, you know, because most entrepreneurs at some point question what they're doing because it's not going so well, or they're screwing it up and they don't know how to unscrew it up because we're all learning.

[474] And it's hard to be learning, you know, when they're like 25 variables going on.

[475] If you, you know, if you're missing four big ones, you can really make a mess.

[476] And so the ability to, in effect, have either an outsider who's really smart that you can rely on for certain types of things, or other people who are working with you on a daily basis.

[477] Most people who haven't had experience believe in the myth of the one person, one great person who, you know, makes outcomes, creates outcomes that are positive.

[478] Most of us, it's not like that.

[479] If you look back, over a lot of the big successful tech companies, it's not typically one person.

[480] And you will know these stories better than I do because it's your world, not mine, but even I know that almost every one of them had two people.

[481] I mean, if you look at Google, you know, that's what they had, and that was the same at Microsoft at the beginning.

[482] And, you know, it was the same at Apple.

[483] You know, people have different skills.

[484] and they need to play off of other people.

[485] So, you know, the advice that I would give you is make sure you understand that, so you don't head off in some direction as a lone wolf and find that either you can't invent all the solutions or you make bad decisions on certain types of things.

[486] This is a team sport.

[487] Entrepreneur means you're alone, in effect, and that's the myth. But it's mostly a myth. Yeah, I think, and you talk about this in your book, and I could talk to you about it forever, the harshly self-critical aspect to your personality, and to mine as well, in the face of failure. It's a powerful tool, but it's also a burden. That's very interesting, very interesting to walk that line.

[488] But let me ask in terms of people around you, in terms of friends, in the bigger picture of your own life, where do you put the value of love, family, friendship in the big picture journey of your life?

[489] Well, ultimately all journeys are alone.

[490] It's great to have support.

[491] And, you know, when you go forward and say your job is to make something work, and that's your number one priority, and you're going to work at it to make it work, you know, it's like superhuman effort.

[492] People don't become successful as part -time workers.

[493] It doesn't work that way.

[494] And if you're prepared to make that 100 to 120% effort, you're going to need support, and you're going to have to have people involved with your life who understand that that's really part of your life.

[495] Sometimes you're involved with somebody and, you know, they don't really understand that and that's a source of, you know, sort of conflict and difficulty.

[496] But if you're involved with the right people, you know, whether it's a sort of dating relationship or, you know, sort of spousal relationship, you know, you have to involve them in your life, but not burden them with every, you know, sort of minor triumph or mistake.

[497] they actually get bored with it after a while.

[498] And so you have to set up different types of ecosystems.

[499] You have your home life, you have your love life, you have children, and that's like the enduring part of what you do.

[500] And then on the other side, you've got the sort of unpredictable nature of this type of work.

[501] What I say to people at my firm who were younger, usually, well, everybody's younger, but, you know, people who are of an age where, you know, they're just having their first child, or maybe they have two children, that it's important to make sure they go away with their spouse at least once every two months to just some lovely place where there are no children, no issues, sometimes once a month if they're sort of energetic and clever and that...

[502] Escape the craziness of it all?

[503] Yeah, and reaffirm your values as a couple.

[504] And you have to have fun, If you don't have fun with the person you're with and all you're doing is dealing with issues, then that gets pretty old.

[505] And so you have to protect the fun element of your life together.

[506] And the way to do that isn't by hanging around the house and dealing with, you know, sort of more problems.

[507] You have to get away and reinforce and reinvigorate your relationship.

[508] And whenever I tell one of our younger people about that, they sort of look at me, and it's like the scales are falling off of their eyes, and they're saying, geez, you know, I hadn't thought about that.

[509] You know, I'm so enmeshed in all these things, but that's a great idea.

[510] And that's something, as an entrepreneur,

[511] you also have to do.

[512] You just can't let relationships slip because you're half overwhelmed.

[513] Beautifully put, and I think there's no better place to end it.

[514] Steve, thank you so much.

[515] I really appreciate it.

[516] It was an honor to talk to you.

[517] My pleasure.

[518] Thanks for listening to this conversation with Stephen Schwarzman, and thank you to our sponsors, ExpressVPN and Masterclass.

[519] Please consider supporting the podcast by signing up to Masterclass at masterclass.com/lex and getting ExpressVPN at expressvpn.com/lexpod.

[520] If you enjoy this podcast, subscribe on YouTube, review it with five stars on Apple Podcast,

[521] support it on Patreon, or simply connect with me on Twitter at Lex Fridman.

[522] And now let me leave you with some words from Stephen Schwarzman's book, What It Takes.

[523] It's as hard to start and run a small business as it is to start a big one.

[524] You will suffer the same toll financially and psychologically as you bludgeon it into existence.

[525] It's hard to raise the money and to find the right people.

[526] So if you're going to dedicate your life to a business, which is the only way it will ever work,

[527] you should choose one with the potential to be huge.

[528] Thank you for listening and hope to see you next time.