#148 – Charles Isbell and Michael Littman: Machine Learning and Education

Lex Fridman Podcast #148

Full Transcription:

[0] The following is a conversation with Charles Isbell and Michael Littman.

[1] Charles is the dean of the College of Computing at Georgia Tech, and Michael is a computer science professor at Brown University.

[2] I've spoken with each of them individually on this podcast, and since they are good friends in real life, we all thought it would be fun to have a conversation together.

[3] Quick mention of each sponsor, followed by some thoughts related to the episode.

[4] Thank you two.

[5] Athletic Greens, the all-in-one drink that I start every day with to cover all my nutritional bases; Eight Sleep, a mattress that cools itself and gives me yet another reason to enjoy sleep; Masterclass, online courses from some of the most amazing humans in history; and Cash App, the app I use to send money to friends.

[6] Please check out the sponsors in the description to get a discount and to support this podcast.

[7] As a side note, let me say that having two guests on the podcast is an experiment that I've been meaning to do for a while.

[8] In particular, because down the road, I would like to occasionally be a kind of moderator for debates between people that may disagree in some interesting ways.

[9] If you have suggestions for who you would like to see debate on this podcast, let me know.

[10] As with all experiments of this kind, it is a learning process.

[11] Both the video and the audio might need improvement.

[12] I realized I think I should probably do three or more cameras next time as opposed to just two.

[13] And also try different ways to mount the microphone for the third person.

[14] Also, after recording this intro, I'm going to have to go figure out the thumbnail for the video version of the podcast, since I usually put the guest's head on the thumbnail, and now there's two heads and two names to try to fit into the thumbnail.

[15] It's a kind of bin-packing problem, which in theoretical computer science happens to be an NP-hard problem.
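As an aside, the bin-packing problem mentioned here can be made concrete with a short sketch. This is a generic first-fit greedy heuristic (exact bin packing is the NP-hard part), with made-up item sizes; nothing here is from the episode.

```python
# Illustrative sketch of the bin-packing problem mentioned above.
# Exact bin packing is NP-hard; first-fit is a simple greedy heuristic.
def first_fit(items, capacity):
    """Pack each item into the first bin that still has room."""
    bins = []  # each bin is a list of item sizes
    for size in items:
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits; open a new one
    return bins

# Hypothetical item sizes; first-fit packs these six items into 3 unit bins.
bins = first_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.1], 1.0)
```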

[16] Whatever I come up with, if you have better ideas for the thumbnail, let me know as well.

[17] And in general, I always welcome ideas how this thing can be improved.

[18] If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.

[19] As usual, I'll do a few minutes of ads now and no ads in the middle.

[20] I try to make these interesting, but I do give you time stamps so you can go ahead and skip if you must, but please do check out the sponsors by clicking the links in the description.

[21] It's the best way to support this podcast.

[22] This show is sponsored by Athletic Greens, the all-in-one daily drink to support better health and peak performance.

[23] It replaced the multivitamin for me and went far beyond that with 75 vitamins and minerals.

[24] I do intermittent fasting of 16 to 24 hours every day and always break my fast with Athletic Greens.

[25] I can't say enough good things about these guys.

[26] It helps me not worry whether I'm getting all the nutrients I need.

[27] One of the many reasons I'm a fan is that they keep iterating on their formula.

[28] I love continuous improvement.

[29] Life is not about reaching perfection.

[30] It's about constantly striving for it and making sure each iteration is a positive delta.

[31] The other thing I've taken for a long time outside of Athletic Greens is fish oil.

[32] So I'm especially excited, even though I genetically don't seem to be capable of generating the sound of excitement with my voice.

[33] I'm excited now that they're selling fish oil and are offering listeners of this podcast a free one-month supply of wild-caught omega-3 fish oil

[34] when you go to athleticgreens.com/lex to claim this special offer.

[35] Click athleticgreens.com/lex in the description to get the fish oil and the all-in-one supplement I rely on for the nutritional foundation of my physical and mental performance.

[36] This episode is also sponsored by 8Sleep and its Pod Pro mattress.

[37] It controls temperature with an app.

[38] It's packed with sensors.

[39] It can cool down to as low as 55 degrees on each side of the bed separately.

[40] It's been a game changer for me. I just enjoy sleep and power naps more.

[41] I feel like I fall asleep faster and get more restful sleep.

[42] The combination of a cool bed and a warm blanket is amazing.

[43] Now, if you love your current mattress but are still looking for temperature control, Eight Sleep's new Pod Pro cover adds dynamic cooling and heating capabilities onto your current mattress.

[44] It can cool down to 55 degrees or heat up to 110 degrees and do so on each side of the bed separately.

[45] It's magic, really.

[46] Also, it can track a bunch of metrics like heart rate variability, but honestly, cooling alone is worth the money.

[47] Go to 8sleep.com/lex, and when you buy stuff there during the holidays, you get special savings as listeners of this podcast.

[48] and you know the savings is special because I use the word special.

[49] Again, that's 8sleep.com/lex.

[50] This show is also sponsored by Masterclass, $180 a year for an all -access pass to watch courses from literally the best people in the world on a bunch of different topics.

[51] Let me list some that I have watched and enjoyed.

[52] Chris Hadfield on space exploration, Neil deGrasse Tyson on scientific thinking and communication.

[53] I probably should get Neil on this podcast soon.

[54] Will Wright, creator of SimCity and The Sims, on game design, Carlos Santana on guitar.

[55] I'm working on Europa right now, actually.

[56] Garry Kasparov on chess, Daniel Negreanu on poker, Neil Gaiman on storytelling, Martin Scorsese on filmmaking, Tony Hawk on skateboarding, and Jane Goodall on conservation.

[57] By the way, you can watch it on basically any device. Sign up at masterclass.com/lex for the buy-one-get-one-free membership for you and a friend. That's masterclass.com/lex. This show is presented by Cash App, the number one finance app in the App Store. When you get it, use code LEXPODCAST. Cash App lets you send money to friends, buy Bitcoin, and invest in the stock market with as little as $1.

[58] In fact, just yesterday, I think, I tweeted that the Mars economy will run on cryptocurrency.

[59] I do believe that's true.

[60] It's kind of the obvious trajectory, but it's also fun to talk about.

[61] And I wonder what that cryptocurrency will be.

[62] Right now, Bitcoin and Ethereum seem to be dominating the space, but who knows what 10, 20, 30, 50, 100 years from now it looks like.

[63] Anyway, I hope to talk to a bunch of folks from the cryptocurrency space on this podcast soon, including, once again, the great, the powerful Vitalik Buterin.

[64] So again, if you get Cash App from the App Store or Google Play and use code LEXPODCAST, you get $10, and Cash App will also donate $10 to FIRST, an organization that is helping to advance robotics and STEM education for young folks around the world.

[65] And now here's my conversation with Charles Isbell and Michael Littman.

[66] You'll probably disagree about this question, but what would you say is your biggest disagreement, about either something profound and very important or something completely not important at all?

[67] I don't think you have any disagreements at all.

[68] I'm not sure that's true.

[69] We walked into that one, didn't we?

[70] So one thing that you sometimes mention is that, and we did this one on air too, as it were, whether or not machine learning is computational statistics.

[71] It's not.

[72] But it is.

[73] Well, it's not.

[74] And in particular, and more importantly, it is not just computational statistics.

[75] So what's missing in the picture?

[76] All the rest of it.

[77] What's missing?

[78] That which is missing.

[79] Oh, well, you can't be wrong now.

[80] Well, it's not just the statistics.

[81] He doesn't even believe this.

[82] We've had this conversation before.

[83] If it were just the statistics, then we would be happy with where we are.

[84] But it's not just the statistics.

[85] That's why it's computational statistics.

[86] Or if it were just the computational.

[87] I agree that machine learning is not just statistics.

[88] It is not just statistics.

[89] We can agree on that.

[90] No, is it just computational statistics.

[91] It's computational statistics.

[92] It is computational.

[93] What is the computational and computational statistics?

[94] Does this take us into the realm of computing?

[95] It does, but I think perhaps the way I can get him to admit that he's wrong.

[96] Is that it's about rules.

[97] It's about rules.

[98] It's about symbols.

[99] It's about all these other things.

[100] But statistics is not about rules.

[101] I'm going to say statistics is about rules.

[102] But it's not just the statistics, right?

[103] It's not just a random variable that you choose and you have a probability.

[104] I think you have a narrow view of statistics.

[105] Okay, well, then what would be the broad view of statistics that would still allow it to be statistics, and not, say, history, that would make computational statistics okay?

[106] Well, okay.

[107] So my first sort of research mentor, a guy named Tom Landauer, taught me to do some statistics, right?

[108] Sure.

[109] And I was annoyed all the time because the statistics would say that what I was doing was not statistically significant.

[110] And I was like, but, but, but, but, and basically what he said to me is, statistics is how you're going to keep from lying to yourself.

[111] which I thought was really deep.

[112] It is a way to keep yourself honest in a particular way.

[113] I agree with that.

[114] Yeah.

[115] And so you're trying to find rules.

[116] I'm just kind of bringing back to rules.

[117] Wait, wait, wait.

[118] Could you possibly try to define rules?

[119] Even regular statisticians, non -computational statisticians, do spend some of their time evaluating rules, right?

[120] Applying statistics to try to understand, does this, you know, does this rule capture this?

[121] Does this not capture that?

[122] I mean, like, hypothesis testing kind of thing, or like confidence intervals? Like, I feel like the word statistic literally means a summary, like a number that summarizes other numbers, right? But I think the field of statistics actually applies that idea to things like rules, to understand whether or not a rule is valid. Is software engineering statistics? No. Is a programming language statistics? No. Because I think it's very, it's useful to think about a lot of what AI and machine learning is, or certainly should be, as software engineering, as programming languages.
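The idea that statistics "keeps you from lying to yourself" via hypothesis testing can be illustrated with a minimal permutation test: a statistic summarizes numbers, and the test asks whether the observed summary could plausibly arise by chance. This is a generic sketch with made-up samples, not anything from the episode.

```python
import random

# A statistic is just a number that summarizes other numbers; hypothesis
# testing asks whether an observed statistic could plausibly arise by chance.
def permutation_test(a, b, trials=10_000, seed=0):
    """Estimate a p-value for the difference in means of two samples."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)  # random relabeling of the pooled data
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / trials

# Clearly separated samples should yield a small p-value.
p = permutation_test([5.1, 5.3, 4.9, 5.2], [6.8, 7.1, 6.9, 7.0])
```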

[123] Just to put it in language that you might understand: the hyperparameters beyond the problem.

[124] Hyperparameters has too many syllables for you to understand.

[125] The hyperparameters of, that goes around it, right?

[126] It's the decisions you choose to make.

[127] It's the metrics you choose to use.

[128] It's the loss function.

[129] So you wanna say the practice of machine learning is different than the practice of statistics.

[130] Like the things you have to worry about and how you worry about them are different, therefore they're different.

[131] Right.

[132] At the very least.

[133] I mean, at the very least, that much is true.

[134] It doesn't mean that statistics, computational, or otherwise aren't important.

[135] I think they are.

[136] I mean, I do a lot of that, for example.

[137] But I think it goes beyond that.

[138] I think that we could think about game theory in terms of statistics, but I don't think it's as useful to do so.

[139] I mean, the way I would think about it, or a way I would think about it is this way.

[140] Chemistry is just physics.

[141] But I don't think it's as useful to think about chemistry as being just physics.

[142] It's useful to think about it as chemistry, the level of abstraction really matters here.

[143] So I think it is, there are contexts in which it is useful.

[144] Yes.

[145] So finding that connection is actually helpful.

[146] And I think that's when I emphasize the computational statistics thing.

[147] I think I want to befriend statistics and not absorb them.

[148] Here's a way to think about it beyond what I just said, right?

[149] So what would you say, and I want you to think back to a conversation we had a very long time ago, what would you say is the difference between, say, the early 2000s ICML and what we used to call NIPS, now NeurIPS?

[150] Is there a difference?

[151] A lot of it, particularly in the machine learning that was done there?

[152] ICML was around that long?

[153] Oh, yeah.

[154] So ICLR is the new conference, newish.

[155] Yeah, I guess so.

[156] And ICML was around in 2000?

[157] Oh, ICML predates that.

[158] I think my most cited ICML paper is from 94.

[159] Yeah.

[160] Michael knows this better than me because, of course, he's significantly older than I. But the point is.

[161] Yeah.

[162] What is the difference between ICML and NeurIPS in the late 90s, early 2000s?

[163] I don't know what everyone else's perspective would be, but I had a particular perspective at that time, which is I felt like ICML was more of a computer science place, and that NIPS, now NeurIPS, was more of an engineering place, like the kind of math that happened at the two places.

[164] As a computer scientist, I felt more comfortable with the ICML math, and the NeurIPS people would say that that's because I'm dumb.

[165] and that's such an engineering thing to say.

[166] I agree with that part, but I'd put it a little differently.

[167] We actually had a nice conversation with Tom Dietterich about this in public.

[168] On Twitter, just a couple of days ago.

[169] I put it a little differently, which is that ICML was machine learning done by computer scientists.

[170] And NERPS was machine learning done by computer scientists trying to impress statisticians.

[171] Which was weird because it was the same people, at least by the time I started paying attention.

[172] But it just felt very, very different.

[173] And I think that that perspective of whether you're trying to impress the statisticians or you're trying to impress the programmers is actually very different and has real impact on what you choose to worry about and what kind of outcomes you come to.

[174] So I think it really matters.

[175] I think computational statistics is a means to an end.

[176] It is not an end in some sense.

[177] And I think that really matters here in the same way that I don't think computer science is just engineering or just science or just math or whatever.

[178] Okay, so I'd have to now agree that now we agree on everything.

[179] Yes, yes.

[180] The important thing here is that, you know, my opinions may have changed, but not the fact that I'm right, I think, is what we just came to.

[181] Right.

[182] And my opinions may have changed and not the fact that I'm wrong.

[183] That's right.

[184] I lost me. I'm not even...

[185] I think I lost myself there, too.

[186] But anyway, but we're back.

[187] We're back.

[188] This happens to us sometimes.

[189] We're sorry.

[190] How do neural networks change this, just to even linger on this topic, change this idea of statistics, how big of a pie statistics is within the machine learning thing?

[191] Because it sounds like hyperparameters and also just the role of data.

[192] You know, people are starting to use the terminology of Software 2.0, which is like the act of programming as a, like, you're a designer in the hyperparameter space of neural networks, and you're also the collector and the organizer and the cleaner of the data.

[193] And that's part of the programming.

[194] So how did, on the Neurips versus ICML topic, what's the role of neural networks in redefining the size and the role of machine learning?

[195] I can't wait to hear what Michael thinks about this, but I would add one.

[196] But you will.

[197] That's true.

[198] I will force myself to.

[199] I think there's one thing I would add to your description, which is the kind of software engineering part is what does it mean to debug, for example.

[200] But this is a difference between the kind of computational statistics view of machine learning and the computational view of machine learning, which is, I think one is worried about the equation, as it were.

[201] And by the way, this is not a value judgment.

[202] I just think it's about perspective.

[203] But the kind of questions you would ask, when you start asking yourself what it means to program and develop and build the system, is a very computer science view of the problem.

[204] I mean, if you get on data science Twitter and econ Twitter, you actually hear this a lot, with the economists and the data scientists complaining about the machine learning people:

[205] well, it's just statistics, and I don't know why they don't see this, but they're not even asking the same questions.

[206] They're not thinking about it as a kind of programming problem, and I think that that really matters, just asking this question.

[207] I actually think it's a little different from programming in hyperparameter space and sort of collecting the data, but I do think that that immersion really matters.

[208] So I'll give you a quick example the way I think about this.

[209] So I teach machine learning.

[210] Michael and I have co-taught a machine learning class, which has now reached, I don't know, 10,000 people at least

[211] over the last several years, or somewhere thereabouts.

[212] And my machine learning assignments are of this form.

[213] So the first one is something like: implement these five algorithms, you know, k-NN and SVMs and boosting and decision trees and neural networks.

[214] And maybe that's it.

[215] I can't remember.

[216] And when I say implement, I mean steal the code.

[217] I am completely uninterested.

[218] You get zero points for getting the thing to work.

[219] I don't want you spending your time worrying about getting the corner case right of, you know, what happens when you're trying to normalize distances between points and you end up dividing by zero.

[220] I'm not interested in that, right?

[221] Steal the code.
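The divide-by-zero corner case mentioned above shows up, for example, when turning distances into weights (as in distance-weighted k-NN) and two points coincide; a common guard is a small epsilon. This is a generic sketch, not code from the course.

```python
# Guarding the divide-by-zero corner case when turning distances into
# weights (e.g. for distance-weighted k-NN): identical points give a
# zero distance, so clamp with a tiny epsilon before inverting.
def inverse_distance_weights(distances, eps=1e-12):
    """Convert distances to inverse-distance weights without dividing by zero."""
    return [1.0 / max(d, eps) for d in distances]

weights = inverse_distance_weights([0.0, 0.5, 2.0])  # zero distance is safe
```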

[222] However, you're going to run those algorithms on two datasets.

[223] The data sets have to be interesting.

[224] What does it mean to be interesting?

[225] Well, a data set is interesting if it reveals differences between algorithms, which presumably are all the same because they can represent whatever they can represent.

[226] And two data sets are interesting together if they show different differences, as it were.

[227] And you have to analyze them.

[228] You have to justify their interestingness, and you have to analyze them in a whole bunch of ways.

[229] But all I care about is the data and your analysis, not the programming.
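The spirit of the assignment, off-the-shelf implementations with the analysis of two datasets doing the real work, might be sketched like this. The datasets and the use of scikit-learn are illustrative assumptions, not the actual course materials.

```python
# "Steal the code": use library implementations, and ask whether two
# datasets reveal differences between the algorithms.
from sklearn.datasets import make_moons, make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

datasets = {
    "moons": make_moons(n_samples=300, noise=0.25, random_state=0),
    "tabular": make_classification(n_samples=300, n_features=10, random_state=0),
}
algorithms = {
    "k-NN": KNeighborsClassifier(),
    "SVM": SVC(),
    "tree": DecisionTreeClassifier(random_state=0),
}

results = {}
for dname, (X, y) in datasets.items():
    for aname, model in algorithms.items():
        # Mean cross-validated accuracy: the analysis, not the code, is the point.
        results[(dname, aname)] = cross_val_score(model, X, y, cv=5).mean()
```

An "interesting" pair of datasets is then one where the ranking of algorithms differs between the two rows of `results`.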

[230] And I occasionally end up in these long discussions with students.

[231] Well, I don't really; I copy and paste the things that I've said the other 15,000 times it's come up, which is, they go, but the only way to really understand is to code them up, which is a very programmer, software engineering view of the world.

[232] If you don't program it, you don't understand it.

[233] Which, by the way, I think is wrong in a very specific way, but it is a way that you come to understand, because then you have to wrestle with the algorithm.

[234] But the thing about machine learning is it's not just sorting numbers where in some sense the data doesn't matter.

[235] What matters is, well, does the algorithm work on these abstract things, one list or the other?

[236] In machine learning, the data matters.

[237] It matters more than almost anything.

[238] And not everything, but almost anything.

[239] And so as a result, you have to live with the data and don't get distracted by the algorithm per se.

[240] And I think that that focus on the data and what it can tell you and what question it's actually answering for you as opposed to the question you thought you were asking is a key and important thing about machine learning and is a way that computationalists as opposed to statisticians bring a particular view about how to think about the process.

[241] The statisticians, by contrast, bring, I think I'd be willing to say, a better view about the kind of formal math that's behind it and what an actual number ultimately is saying about the data.

[242] And those are both important, but they're also different.

[243] I didn't really think of it this way: to build intuition about the role of data, the different characteristics of data, by having two data sets that are different, so they reveal the differences in the differences.

[244] That's really fascinating.

[245] That's a really interesting educational approach.

[246] The students love it, but not right away.

[247] No, they love it at the end.

[248] They love it at the end.

[249] Not at the beginning.

[250] Not even immediately after.

[251] I feel like there's a deep, profound lesson about education there, that you can't listen to students about whether what you're doing is the right or the wrong thing.

[252] Well, as a wise Michael Littman once said to me about children, which I think applies to teaching: you have to give them what they need without bending to their will.

[253] And students are like that.

[254] You have to figure out what they need.

[255] You're a curator.

[256] Your whole job is to curate and to present because on their own, they're not going to necessarily know where to search.

[257] So you're providing pushes in some direction in the learning space.

[258] And you have to give them what they need in a way that keeps them engaged enough so that they eventually discover what they want, and they get the tools they need to go and learn other things.

[259] What's your view?

[260] Let me put on my Russian hat, which believes that life is suffering.

[261] I like Russian hats, by the way.

[262] If you have one, I would like this.

[263] Those are ridiculous.

[264] Yes.

[265] But in a delightful way.

[266] But sure.

[267] What do you think is the role of, we talked about balance a little bit, what do you think is the role of hardship and education?

[268] Like, I think the biggest things I've learned, like what made me fall in love with math, for example, is by being bad at it until I got good at it.

[269] So, like, struggling with a problem, which increased the level of joy I felt when I finally figured it out.

[270] And it always felt to me, with teachers, especially modern discussions of education: how can we make education more fun, more engaging, more all those things?

[271] Or from my perspective, it's like you're maybe missing the point: that education, that life, is suffering.

[272] Education is supposed to be hard, and that's actually what increases the joy you feel when you actually learn something.

[273] Is that ridiculous?

[274] Do you like to see your students suffer?

[275] Okay, so this may be a point where we differ.

[276] I suspect not.

[277] Okay.

[278] Well, what would your answer be?

[279] I want to hear you first.

[280] Okay, well, I was going to not answer the question.

[281] So you don't want the students to know you enjoy them suffering?

[282] No, no, no, no, no. I was going to say that there's, I think there's a distinction that you can make in the kind of suffering, right?

[283] So I think you can be in a mode where you're suffering in a hopeless way versus you're suffering in a hopeful way right where you're like you can see that if you that you still have you can still imagine getting to the end right and as long as people are in that mindset where they're struggling but it's not a hopeless kind of struggling that's that's productive i think that's really helpful but it's struggling like if you you break their will if you leave them hopeless no that don't sure some people are going to whatever lift themselves up by their bootstraps but like mostly you give up and certainly it takes the joy out of it and you're not going to spend a lot of time on something that brings you no joy.

[284] So it's a, it is a bit of a delicate balance, right?

[285] You have to thwart people in a way that they still believe that there's a way through.

[286] Right.

[287] So that's a, that we strongly agree actually.

[288] So I think, well, first off, struggling and suffering aren't the same thing, right?

[289] Just being poetic.

[290] Oh, no, no, I actually appreciate the poetry.

[291] And one of the reasons I appreciate it is that they are often the same thing and often quite different, right?

[292] So you can struggle without suffering.

[293] You can certainly suffer pretty easily.

[294] You don't necessarily have to struggle to suffer.

[295] So I think that you want people to struggle, but that hope matters.

[296] You have to understand that they're going to get through it on the other side.

[297] And it's very easy to confuse the two.

[298] I actually think Brown University has a very, just philosophically has a very different take on the relationship with their students, particularly undergrads from, say, a place like Georgia Tech, which is...

[299] Which universities better?

[300] Well, I have my opinions on that.

[301] I mean, remember, Charles said, it doesn't matter what the facts are.

[302] I'm always right.

[303] The correct answer is that it doesn't matter.

[304] They're different.

[305] But there are clearly answers to that.

[306] He went to a school like the school where he is as an undergrad.

[307] I went to a school, specifically the same school, though it was changed a bit in the intervening years.

[308] Brown or Georgia Tech?

[309] No, I was talking about Georgia Tech.

[310] And I went...

[311] Yeah, and I went to an undergrad place that's a lot like the place where I work now, and so it does seem like we're more familiar with these models.

[312] There's a similarity between Brown and Yale?

[313] Yeah, I think they're quite similar, yeah.

[314] And Duke.

[315] Duke has some similarities, too, but it's got a little southern drawl.

[316] You've kind of worked, you've sort of worked at universities that are like the places where you learned, and the same would be true for me. Are you uncomfortable venturing outside the box?

[317] Is that what you're saying, journeying out?

[318] Not what I'm saying.

[319] Yeah, Charles is definitely, he only goes to places that have institute in the name, right?

[320] It has worked out that way.

[321] Well, no, I was a visiting scientist at UPenn, or visiting something at UPenn.

[322] Oh, wow, I just understood your joke.

[323] Which one?

[324] Five minutes later.

[325] I like to set these sort of time bombs.

[326] The institute is in the, uh, that Charles only goes to places that have institute in the name.

[327] So I guess Georgia, I forget that Georgia Tech is Georgia Institute of Technology.

[328] The number of people who referred to it as Georgia Tech University is large and incredibly irritating.

[329] It's one of the few things that genuinely gets under my skin.

[330] But like schools like Georgia Tech and MIT have as part of the ethos.

[331] Like there is, I want to say there's an abbreviation that someone taught me, like IHTFP, something like that.

[332] Like there's an expression, which is basically, I hate being here, which they say so proudly.

[333] And that is definitely not the ethos at Brown.

[334] Like, Brown is, there's a little more pampering and empowerment and stuff.

[335] And it's not like, we're going to crush you and you're going to love it.

[336] So, yeah, I think there's a, I think the ethoses are different.

[337] That's interesting, yeah.

[338] We had drown proof.

[339] What's that?

[340] In order to graduate from Georgia Tech, this is a true thing.

[341] Feel free to look it up.

[342] A lot of schools have this, by the way.

[343] No. Georgia Tech was basically the first.

[344] Brandeis has it.

[345] Had it.

[346] I feel like Georgia Tech was the first.

[347] In a lot of ways.

[348] It was the first in a lot of things.

[349] Had the first master's degree in...

[350] Stop that.

[351] First master's in computer science, actually.

[352] Right, online masters.

[353] Well, that too, but way back in the 60s.

[354] NSF grant.

[355] Yeah, yeah.

[356] You had the first information and computer science master's degree in the country.

[357] But at Georgia Tech, it used to be the case that in order to graduate from Georgia Tech,

[358] you had to take a drownproofing class, where effectively, they threw you in water, tied you up.

[359] If you didn't drown, you got to graduate.

[360] Tied you up?

[361] I believe so.

[362] No. There were certainly versions of it, but I mean, luckily, they ended it just before I had to graduate because otherwise we'd have never graduated.

[363] It was going to happen.

[364] I want to say 84, 83, somewhere around then.

[365] They ended it, but yeah, you used to have to prove you could tread water for some ridiculous amount of time.

[366] Two minutes.

[367] You couldn't graduate.

[368] No, it was more than two minutes.

[369] Okay, well, we'll look.

[370] And it was in a bathtub.

[371] It was in a pool.

[372] But it was a real thing.

[373] But that idea that, you know, push you.

[374] Fully clothed.

[375] Yeah, fully clothed.

[376] I bet it was that and not tied up.

[377] Because, like, who needs to learn how to swim when you're tied?

[378] Nobody, but who needs to learn to swim when you're actually falling into the water dressed?

[379] That's a real thing.

[380] I think your facts are getting in the way of a good story.

[381] Oh, that's fair.

[382] That's fair.

[383] I didn't agree with it.

[384] All right.

[385] So they tie you up.

[386] The narrative matters.

[387] But whatever it was, you had to, it was called drownproofing for a reason.

[388] The point of the story, Michael, is that it's.

[389] Struggle.

[390] Well, no, but that's good.

[391] It does bring it back to struggle.

[392] That's a part of what Georgia Tech

[393] has always been, and we struggle with that, by the way, about what we want to be, particularly as things go.

[394] But you sort of, how much can you be pushed without breaking?

[395] And you come out of the other end stronger, right?

[396] There's this saying we used to have when I was an undergrad there, which was Georgia Tech, building tomorrow the night before.

[397] Right?

[398] And this is just kind of idea that, you know, give me something impossible to do, and I'll do it in a couple of days, because that's what I just spent the last four or five or six.

[399] That ethos definitely stuck to you, having now done a number of projects with you, you definitely will do it the night before.

[400] That's not entirely true.

[401] There's nothing wrong with waiting until the last minute.

[402] The secret is knowing when the last minute is.

[403] Right.

[404] That's brilliantly put.

[405] Yeah.

[406] That is a definite Charles statement that I am trying not to embrace.

[407] Wow.

[408] And I appreciate that because you helped move my last minute.

[409] That's a social construct where you converge together what the definition of last minute is.

[410] And we figure that out all together.

[411] In fact, MIT, you know, I'm sure a lot of universities have this, but MIT has, like, MIT time: everyone has always agreed together that there is such a concept, and everyone just keeps showing up 10 or 15 or 20 minutes late, depending on the department, to everything.

[412] So there's like a weird drift that happens.

[413] It's kind of fascinating.

[414] Yeah, we're five minutes.

[415] In fact, the classes will say, you know, well, this is no longer true, actually.

[416] But it used to be a class would start at eight, but actually it started at 8:05.

[417] It ends at nine.

[418] Actually, it ends at 8:55.

[419] Everything's five minutes off, and nobody expects anything to start until five minutes after the half hour, whatever it is.

[420] It still exists.

[421] It hurts my head.

[422] Well, let's rewind the clock back to the 50s and 60s when you guys met.

[423] How did you?

[424] I'm just kidding.

[425] I don't know.

[426] But can you tell the story of how you met?

[427] So the internet and the world kind of knows you as connected in some ways.

[428] in terms of education of teaching the world.

[429] That's like the public-facing thing, but how did you as human beings and as collaborators meet?

[430] I think there's two stories.

[431] One is how we met, and the other is how we got to know each other.

[432] I'm not going to say fall in love.

[433] I'm not going to say that we came to understand that we...

[434] Had some common something.

[435] Yeah, it's funny, because on the surface, I think we're different in a lot of ways, but there's something that's just consonant.

[436] There you go.

[437] Afternoons.

[438] So I will tell the story of how we met, and I'll let Michael tell the story of how we met, met.

[439] Okay, all right.

[440] Okay, so here's how we met.

[441] I was already there; at that point, it was AT&T Labs.

[442] There's a long, interesting story there.

[443] But anyway, I was there, and Michael was coming to interview.

[444] He was a professor at Duke at the time, but decided for reasons that he wanted to be in New Jersey.

[445] And so that would mean Bell Labs slash AT&T Labs.

[446] And we were doing the interview.

[447] Interviews were very much like academic interviews.

[448] And so I had to be there.

[449] We all had to meet with him afterwards and so on, one-on-one.

[450] But it was obvious to me that he was going to be hired.

[451] Like, no matter what, because everyone loved him.

[452] They were just talking about all the great stuff he did.

[453] Oh, he did this great thing.

[454] And you had just won something at AAAI, I think.

[455] Or maybe you got 18 papers in AAAI that year.

[456] I got the best paper award at AAAI for the crossword stuff.

[457] Right, exactly.

[458] So that had all happened and everyone was going on and on and on about it.

[459] Actually, Satinder was saying incredibly nice things about you.

[460] Really?

[461] Yes.

[462] He can be very grumpy.

[463] That's nice to hear.

[464] He was grumpily saying very nice things.

[465] Oh, that makes sense.

[466] Yeah, it does make sense.

[467] So, you know, so it was going to come.

[468] So why were we, why was I meeting him?

[469] I had something else I had to do.

[470] I can't remember what it was.

[471] Probably involved comic.

[472] So he remembers meeting me as inconveniencing his afternoon.

[473] So he came, so eventually came to my office.

[474] I was in the middle trying to do something.

[475] I can't remember what.

[476] And he came and he sat down.

[477] And for reasons that are purely accidental, despite what Michael thinks, my desk at the time was set up in such a way that it had sort of an L shape.

[478] and the chair on the outside was always lower than the chair that I was in and, you know, the kind of point was to...

[479] The only reason I think that it was on purpose is because you told me it was on purpose.

[480] I don't remember that.

[481] Anyway, the thing is that, you know, his guest chair was really low so that he could look down at everybody.

[482] The idea was just to simply create a nice environment where it was as if you were asking for a mortgage and I was going to say no. That was the very simple idea here.

[483] Anyway, so we sat there and we just talked for a little while and I think he got the impression that I didn't like him.

[484] It wasn't true.

[485] Strongly got that impression.

[486] The talk was really good.

[487] The talk, by the way, was terrible.

[488] And right after the talk, I said to my host, Michael Kearns, who ultimately was my boss.

[489] I'm a huge fan and a huge fan of Michael, yeah.

[490] Yeah, he is a remarkable person.

[491] After my talk, I went into the...

[492] Racquetball.

[493] He's good at everything.

[494] No, basketball.

[495] No, but basketball, racquetball, too.

[496] Squash.

[497] Squash.

[498] Squash.

[499] Squash, not racquetball.

[500] Yeah, squash, which is not.

[501] Racquetball, yes, squash, no. And I hope you hear that, Michael.

[502] You mean in terms of as a game, not his skill level, because I'm pretty sure he's...

[503] All right, there's some competitiveness there.

[504] But the point is that it was like the middle of the day.

[505] I had a full day of interviews.

[506] I got met with people.

[507] But then in the middle of the day, I gave a job talk.

[508] And then there was going to be more interviews.

[509] But I pulled Michael aside and I said, I think it's in both of our best interest if I just leave now.

[510] Because that was so bad that it'd just be embarrassing if I have to talk to any more

[511] people. Like, you look bad for having invited me. Like, let's just forget this ever happened. So I don't think the talk went well. That's one of the most Michael Littman sets of sentences I think I've ever heard. He did great, or at least everyone knew he was great, so maybe it didn't matter. I was there, I remember the talk, and I remember him being very much the way I remember him now on any given week. So it was good, and we met, and we talked about stuff. He thinks I didn't like him because he was so grumpy. Must have been the chair thing. The chair thing and the low voice, I think. Like... He obviously...

[512] And that, like, slight, like, skeptical look.

[513] Yes.

[514] I have no idea what you're talking about.

[515] Well, I probably didn't have any idea what you were talking about.

[516] Anyway, I liked him.

[517] He asked me questions.

[518] I answered questions.

[519] I felt bad about myself.

[520] It was a normal day.

[521] And then he left.

[522] And then he left, and that's how you met.

[523] Can we take a...

[524] And then I got hired and I was in the group.

[525] Can we take a slight tangent on this topic of...

[526] It sounds like maybe you could speak to the bigger picture.

[527] It sounds like you're quite self-critical.

[528] Who, Charles?

[529] No, you.

[530] Oh, I think I can do better.

[531] I can do better.

[532] I'll try me again.

[533] I'll do better.

[534] I'll be self-critical.

[535] I won't.

[536] I won't.

[537] Yeah, that was like a three out of ten response.

[538] Let's try to work it up to a five or six.

[539] You know, I remember Marvin Minsky said in a video interview something like: the key to success in academic research is to hate everything you do.

[540] Oh.

[541] for some reason.

[542] I think I followed that because I hate everything he's done.

[543] That's a good line.

[544] That's a six.

[545] Maybe that's a keeper.

[546] But do you find that resonates with you at all in how you think about talks and so on?

[547] I would say it differently.

[548] It's not that.

[549] No, not really.

[550] That's such an MIT view of the world.

[551] So I remember talking about this. As a student, you know, you were basically told, and I will clean it up for the purposes of the podcast:

[552] My work is crap, my work is crap, my work is crap, my work is crap.

[553] Then you like go to a conference or something.

[554] You're like, everybody else's work is crap.

[555] Everybody else's work is crap.

[556] And you feel better and better about it, relatively speaking.

[557] And then you sort of keep working on it.

[558] I don't hate my work.

[559] That resonates with me. Yes.

[560] I've never hated my work, but I have been dissatisfied with it.

[561] And I think being dissatisfied, being okay with the fact that you've taken a positive step, the derivative's positive, maybe even the second derivative is positive.

[562] That's important because that's a part of the hope, right?

[563] But you have to... but I haven't gotten there yet.

[564] If that's not there, the "I haven't gotten there yet," then, you know, it's hard to move forward, I think.

[565] So I buy that, which is a little different from hating everything that you do.

[566] Yeah.

[567] I mean, there's things that I've done that I like better than I like myself.

[568] So it's separating me from the work, essentially.

[569] So I think I am very critical of myself, but sometimes the work I'm really excited about.

[570] And sometimes I think it's kind of good.

[571] Does that happen right away?

[572] So I found the work that I've liked that I've done.

[573] Most of it, I liked it in retrospect more when I was far away from it in time.

[574] I have to be fairly excited about it to get done.

[575] No, excited at the time, but then happy with the result.

[576] But years later, or even... I might go back: you know what, that actually turned out to matter.

[577] That turned out to matter.

[578] Or, oh, gosh, it turns out I've been thinking about that.

[579] It's actually influenced all the work that I've done since without realizing it.

[580] Boy, that guy was smart.

[581] Yeah, that guy had a future.

[582] Yeah, I, yeah.

[583] He's going places.

[584] I think there's, so yeah, so I think there's something to it.

[585] I think there's something to the idea we've got to, you know, hate what you do, but it's not quite hate.

[586] It's just being unsatisfied.

[587] And different people motivate themselves differently.

[588] I don't happen to motivate myself with self-loathing.

[589] I happen to motivate myself with something else.

[590] So you're able to sit back and be proud, in retrospect, of the work you've done?

[591] Well, and it's easier when you can connect it with other people, because then you can be proud of them.

[592] A lot of the people, yeah.

[593] And then you get

[594] to still safely hate yourself.

[595] Yeah, that's right.

[596] It's win, win, Michael, or at least win, lose, which is what you're looking for.

[597] Oh, wow.

[598] There's so many brilliant lines in this.

[599] There's levels.

[600] So how did you actually meet, meet?

[601] Yeah, Michael.

[602] So the way I think about it is, because we didn't do much research together at AT&T, but then we all got laid off.

[603] So that was... that sucked.

[604] By the way, sorry to interrupt, but that was like one of the most magical places, historically speaking. They did not appreciate what they had, and how do we... I feel like there's a profound lesson in there too. How do we get it? Like, what was... why was it so magical? Is it just the coincidence of history, or is there something special? There were some really good managers and people who really believed in machine learning: this is going to be important, let's get the people who are thinking about this in creative and insightful ways, put them in one place, and stir. Yeah, but even beyond that, right, it was Bell Labs at its heyday, and even when we were there, which I think was past its heyday.

[605] And to be clear, he's gotten to be at Bell Labs.

[606] I never got to be at Bell Labs.

[607] I joined after that.

[608] Yeah, I showed up in 91 as a grad student.

[609] So I was there for a long time every summer, except for twice I worked for companies that had just stopped being Bell Labs.

[610] Right, Bellcore and then AT&T Labs.

[611] So Bell Labs was several locations or for the research or is it one?

[612] Like, is that like Jersey's?

[613] Oh, yeah.

[614] They're all in Jersey. Yeah, they're all over the place, but they were in a couple of places, and Murray Hill was the Bell Labs place. So you had an office at Murray Hill at one point in your career? Yeah, I played Ultimate Frisbee on the cricket pitch at Bell Labs at Murray Hill. And then it became AT&T Labs when it split off with Lucent, during what we called Trivestiture. Were you better than Michael Kearns at Ultimate Frisbee? Yeah. Oh, yeah. Okay, but I think that one's not boasting. I think Charles plays a lot of Ultimate, and I don't think Mike... No, I was, yes, but that wasn't the point.

[615] The point is yes.

[616] I'm finally better.

[617] Oh, yes, sorry.

[618] Okay, I have played on a championship winning ultimate frisbee team or whatever, ultimate team with Charles.

[619] So I know how good he is.

[620] He's really good.

[621] How good I was anyway when I was younger.

[622] But the thing is.

[623] I know how young he was when he was younger.

[624] That's true.

[625] That is true.

[626] So much younger than now you have.

[627] He's older now.

[628] Yeah, Michael is a much better basketball player than I was.

[629] Michael Kearns.

[630] Yes, no, not Michael.

[631] Let's be very clear.

[632] To be clear, I've not played basketball with you.

[633] So you don't know how terrible I am, but you have a probably pretty good guess.

[634] That you're not as good as Michael Kearns.

[635] He's tall and athletic.

[636] And he cared about it.

[637] He's very athletic, very good.

[638] And probably competitive.

[639] I love hanging out with Michael.

[640] Anyway, but we were talking about something else, although I no longer remember what it was.

[641] What were we talking about?

[642] Oh, Bell Labs.

[643] But also Labs.

[644] So this was kind of cool about what was magical about it.

[645] The first thing you have to know is that Bell Labs was an arm of the government, right?

[646] Because AT&T was an arm of the government.

[647] It was a monopoly.

[648] And, you know, every month you paid a little thing on your phone bill, which turned out was a tax for, like, all the research that Bell Labs was doing.

[649] And, you know, they invented transistors and the laser and whatever else is that they did.

[650] The Big Bang, or whatever... the cosmic background radiation.

[651] Yeah, they did all that stuff.

[652] They had some amazing stuff with directional microphones, by the way.

[653] I got to go in this room where they had all these panels and everything, and we would talk to one another, and he'd move some panels around, and then he'd have me step two steps to the left, and I couldn't hear a thing he was saying because nothing was bouncing off the walls.

[654] And then he would shut it all down, and you could hear your heartbeat, which is deeply disturbing to hear your heartbeat.

[655] You can feel it.

[656] I mean, you can feel it now.

[657] There's just so much all this sort of noise around.

[658] Anyway, Bell Labs is about pure research.

[659] It was a university, in some sense, the purest sense of a university, but without students.

[660] So it was all the faculty working with one another, and students would come in to learn.

[661] They would come in for three or four months during the summer, and they would go away.

[662] But it was just this kind of wonderful experience that I could walk out my door.

[663] In fact, I would often have to walk out my door and deal with Rich Sutton and Michael Kearns yelling at each other about whatever it is they were yelling about the proper way to prove something or another.

[664] And I could just do that, and Dave McAllester and Peter Stone and all of these other people, including Satinder and eventually Michael.

[665] And it was just a place where you could think thoughts.

[666] And it was okay because so long as once every 25 years or so, somebody invented a transistor, it paid for everything else.

[667] You could afford to take the risk.

[668] And then when that all went away, it became harder and harder and harder to justify it as far as the folks who were very far away were concerned.

[669] And there was such a fast turnover among middle management on the AT&T side that you never had a chance to really build a relationship.

[670] At least people like us didn't have a chance to build a relationship.

[671] So when the diaspora happened, it was amazing, right?

[672] Everybody left, and I think everybody ended up at a great place and made a huge, continue to do really good work with machine learning.

[673] But it was a wonderful place.

[674] And people will ask me, you know, what's the best job you've ever had?

[675] And as a professor, the answer that I would give is, well, probably Bell Labs in some very real sense, and I will never have a job like that again because Bell Labs doesn't exist anymore.

[676] And, you know, Microsoft Research is great, and Google does good stuff, and you can pick IBM if you want to, but Bell Labs was magical.

[677] It was around for an important time, and it represents a high watermark in basic research in the U.S. Is there something you could say about the physical proximity and the chance of collisions? Like, we live in this time of the pandemic where everyone is maybe trying to see the silver lining and accepting the remote nature of things. One of the things that people, like the faculty that I talk to, miss is the procrastination. Like, now everything is about meetings that are supposed to be about something. There's not a chance to just, you know, talk about comic books or whatever, to go into a discussion that's totally pointless.

[678] So it's funny you say this because that's how we met, met.

[679] It's exactly that.

[680] So I'll let Michael say that, but I'll just add one thing, which is just that, you know, research is a social process.

[681] And it helps to have random social interactions, even if they don't feel social at the time.

[682] That's how you get things done.

[683] One of the great things about the AI lab when I was there, I don't quite know what it looks like now once they move buildings, but we had entire walls that were whiteboards.

[684] And people would just get up there and they were just right and people would walk up and you'd have arguments and you'd explain things to one another.

[685] And you got so much out of the freedom to do that.

[686] You had to be okay with people challenging every frickin' word you said, which I would sometimes find deeply irritating.

[687] But most of the time, it was quite useful.

[688] But the sort of pointlessness of the interaction was, in some sense, the point, at least for me. Yeah, I mean, I think offline yesterday I mentioned Josh Tenenbaum, and he's very much... oh, man, he's such an inspiration in the childlike way that he pulls you in on any topic.

[689] He doesn't even have to be about machine learning or the brain.

[690] He'll just pull you to the closest writable surface, and you can still find whiteboards everywhere at MIT.

[691] And just, like, basically cancel all meetings and talk for a couple hours about some aimless thing.

[692] It feels like the whole world, the time, space continuum kind of warps, and that becomes the most important thing.

[693] And then it's just, it's so true.

[694] It's definitely something worth missing in this world where everything's remote.

[695] There's some magic to the physical presence.

[696] Whenever I wonder myself whether MIT really is as great as I remember it, I just go talk to Josh.

[697] Yeah.

[698] You know, that's funny.

[699] There's a few people in this world that carry the, the best of what particular institutions stand for, right?

[700] And there's, uh.

[701] It's Josh.

[702] I mean, I don't, I, my guess is he's unaware of this.

[703] That's the point.

[704] Yeah.

[705] that the masters are not aware of their mastery.

[706] So, how did you meet?

[707] Yes, but first a tangent, no. How did you meet me?

[708] So I'm not sure what you were thinking of, but when it started to dawn on me that maybe we had a longer -term bond was after we all got laid off.

[709] And you had decided at that point that we were still paid.

[710] We were given an opportunity to do a job search and kind of make a transition.

[711] but it was clear that we were done and I would go to my office to work and you would go to my office to keep me from working.

[712] That was my recollection of it.

[713] You had decided that there was really no point in working for the company because our relationship with the company was done.

[714] Yeah, but remember, I felt that way beforehand.

[715] It wasn't about the company.

[716] It was about the set of people there doing really cool things and it had always been that way.

[717] But we were working on something together.

[718] Oh, yeah, yeah, yeah, that's right.

[719] So at the very end, we all got laid off, but then our boss's boss came to us. Because our boss was Michael Kearns, and he had jumped ship brilliantly, like perfect timing, right before the ship was about to sink. He was like, got to go, and landed perfectly, because he's Michael Kearns, leaving the rest of us to go, like, this is fine. And then it was clear that it wasn't fine and we were all toast. So we had this sort of long period of time, but then our boss's boss figured out, okay, wait, maybe we can save a couple of these people if we can have them do something really useful.

[720] And the useful thing was we were going to make basically an automated assistant that could help you with your calendar.

[721] You could like tell it things and it would respond appropriately.

[722] It would just kind of integrate across all sorts of your personal information.

[723] And so me and Charles and Peter Stone were set up as the crack team to actually solve this problem.

[724] Other people, they thought, were maybe too theoretical, but we could actually get something done. So we sat down to get something done, and there wasn't time, and it wouldn't have saved us anyway, and so it all kind of went downhill. But the interesting coda to that, I think, is that our boss's boss was a guy named Ron Brachman, and when he left AT&T, because we were all laid off, he went to DARPA and started up a program there that became CALO, which is the program from which Siri sprung. A digital assistant that helps you with your calendar and a bunch of other things. It really, you know, in some ways got its start with me and Charles and Peter trying to implement this vision that Ron Brachman had, that he ultimately got implemented through his role at DARPA.

[725] So when I'm trying to feel less bad about having been laid off from what is possibly the greatest job of all time, I think about, well, we kind of help birth Siri.

[726] Or something. And he did other things too, but we got to spend a lot of time in his office and talk about... We got to spend a lot of time in my office. Yeah, yeah, yeah. And so then we went on our merry way. Everyone went to different places. Charles landed at Georgia Tech, which was what he always dreamed he would do, and so that worked out well. I came up with a saying at the time, which is: luck favors the Charles. It's kind of like luck favors the prepared. But Charles, like, he'd wish something, and then it would basically happen just the way he wanted.

[727] It was inspirational to see things go that way.

[728] Things worked out.

[729] And we stayed in touch.

[730] And then I think it really helped when you were working on, I mean, you kept me in the loop for things like threads and the work that you were doing at Georgia Tech.

[731] But then when they were starting their online master's program, he knew that I was really excited about MOOCs and online teaching.

[732] And he's like, I have a plan.

[733] And I'm like, tell me your plan.

[734] And he's like, I can't tell you the plan yet.

[735] Because they were deep in negotiations between Georgia Tech and Udacity to make this happen.

[736] And they didn't want it to leak.

[737] So Charles kept teasing me about it, but wouldn't tell me what was actually going on.

[738] And eventually it was announced.

[739] And he said, I would like you to teach the machine learning course with me. I'm like, that can't possibly work.

[740] But it was a great idea.

[741] And it was super fun.

[742] It was a lot of work to put together.

[743] But it was really great.

[744] Was that the first time you thought about... well, first of all, was it the first time you got seriously into teaching?

[745] I mean, you know, I was a professor.

[746] Oh, so this was already, this is already after you jumped to.

[747] So like, there's a little bit of jumping around in time.

[748] Yeah, sorry about that.

[749] There's a pretty big jump in time.

[750] So like the MOOCs thing.

[751] So Charles got to Georgia Tech and he, I mean, maybe Charles, maybe this is a Charles story.

[752] I got to Georgia Tech in 2002.

[753] And then worked on things like revamping the curriculum, the undergraduate curriculum, so that it had some kind of semblance of modular structure, because computer science was at the time moving from a fairly narrow, specific set of topics to touching a lot of other parts of intellectual life.

[754] And the curriculum was supposed to reflect that.

[755] And so Charles played a big role in kind of redesigning that.

[756] And for my... for my labors, I ended up as associate dean.

[757] Right.

[758] He got to become associate dean, in charge of educational stuff.

[759] This should be a valuable lesson if you're good at something.

[760] they will give you responsibility to do more of that thing.

[761] Well, until you...

[762] Don't show competence.

[763] Don't show competence if you don't want responsibility.

[764] Here's what they say.

[765] The reward for good work is more work.

[766] The reward for bad work is less work.

[767] Which, I don't know, depending on what you're trying to do that week, one of those is better than the other.

[768] Well, one of the problems with the word work, sorry to interrupt, is that it seems to be an antonym, in this particular language, of happiness.

[769] But that's one of the things, you know; we talked about balance.

[770] It's, it's always like work-life balance.

[771] It always rubbed me the wrong way as a terminology.

[772] I know it's just words.

[773] Right.

[774] The opposite of work is play, but ideally work is play.

[775] Oh, I can't tell you how much time I'd spend, certainly at Bell Labs, except for a few very key moments.

[776] As a professor, I would do this too.

[777] I would just I cannot believe they're paying me to do this.

[778] Because it's fun.

[779] It's something that I would do for a hobby if I could anyway.

[780] So that's how it worked out.

[781] You sure you want to be saying that when this is being recorded?

[782] As a dean, that is not true at all.

[783] I need a raise.

[784] But I think, even though a lot of time passed, you know, Michael and I talked almost every... well, we texted almost every day during that period.

[785] Charles at one point took me, there was the ICML conference, the machine learning conference was in Atlanta.

[786] I was the chair, the general chair of the conference.

[787] Charles was my publicity chair or something like that or fundraising chair?

[788] No, fundraising chair.

[789] Yeah.

[790] But he decided it would be really funny if he didn't actually show up for the conference in his own home city.

[791] So he didn't.

[792] But he did at one point pick me up at the conference in his Tesla and drove me to the Atlanta mall and forced me to buy an iPhone, because he didn't like how hard

[793] it was to text with me, and thought it would be better for him if I had an iPhone; the texting would be somehow smoother.

[794] And it was.

[795] And it was.

[796] And his life is better.

[797] And my life is better.

[798] And so, yeah, but it was, yeah, Charles forced me to get an iPhone so that he could text me more efficiently.

[799] I thought that was an interesting moment.

[800] It works for me. Anyway, so we kept talking the whole time and eventually we did the teaching thing.

[801] And it was great.

[802] And there's a couple of reasons for that, by the way.

[803] One is, I really wanted to do something different.

[804] Like, you've got this medium here.

[805] people claim it can change things.

[806] What's a thing that you could do in this medium that you could not do otherwise?

[807] Besides edit, right?

[808] I mean, what could you do?

[809] And being able to do something with another person was that kind of thing.

[810] It's very hard.

[811] I mean, you can take turns, but teaching together, having conversations is very hard, right?

[812] So that was a cool thing.

[813] The second thing, you give me an excuse to do more stuff with him.

[814] Yeah, I always thought, he makes it sound brilliant.

[815] And it is, I guess.

[816] But at the time, it really felt like I've got a lot to do, Charles is saying, and it would be great if Michael could teach the course and I could just hang out.

[817] Yeah, just kind of coast on that.

[818] Well, the second class was more like that.

[819] Because the second class was explicitly like that.

[820] But the first class, it was at least half.

[821] Yeah, but I knew all this stuff.

[822] I think you're once again letting the facts get in the way...

[823] Of a good story.

[824] Good story.

[825] I should just let Charles talk to us.

[826] But that's the facts as he saw them.

[827] So that was kind of true.

[828] That's your facts.

[829] Yeah, that was sort of true for 7642, which was the reinforcement learning class because that was really his class.

[830] You started with reinforcement learning?

[831] No, we started with... I did the intro to machine learning, 7641, which is supervised learning, unsupervised learning, and reinforcement learning, cramming all that in there, with the kind of assignments that we talked about earlier.

[832] And then eventually, about a year later, we did a follow-on, 7642, which is reinforcement learning and decision-making.

[833] The first class was based on something I had been teaching at that point for well over a decade, and the second class was based on something Michael had been teaching.

[834] Actually, I learned quite a bit teaching that class with him, but he drove most of that.

[835] But the first one, I drove most, it was all my material, although I had stolen that material originally from slides I found online from Michael, who had originally stolen that material from, I guess, slides he found online, probably from Andrew Moore, because the jokes were the same anyway.

[836] At least some of the, at least when I found the slides, some of the stuff was there.

[837] Yes, every machine learning class taught in the early 2000s stole from Andrew Moore.

[838] A particular joke or two.

[839] At least the structure.

[840] Now, I did, and he did actually a lot more with reinforcement learning and such and game theory and those kinds of things.

[841] But, you know, we all sort of built.

[842] You mean in the research world.

[843] No, no, no. No, I mean in teaching that class.

[844] The coverage was different than what other people were doing.

[845] Most people were just doing supervised learning and maybe a little bit of, you know, clustering and whatnot.

[846] But we took it all the way to.

[847] A lot of it just comes from Tom Mitchell's book.

[848] Oh, no. Yeah, except, well, half of it comes from Tom Mitchell's book, right?

[849] The other half doesn't.

[850] This is why it's all readings, right?

[851] Because certain things weren't invented when Tom Mitchell wrote it.

[852] Yeah, okay, that's true.

[853] Right.

[854] But it was quite good.

[855] But there's a reason for that besides, you know, just, I wanted to do it.

[856] I wanted to do something new and I wanted to do something with him, which is a realization, which is, despite what you might believe, he's an introvert and I'm an introvert, or I'm on the edge of being an introvert anyway.

[857] But both of us, I think, enjoy the energy of the crowd, right?

[858] There's something about talking to people and bringing them into whatever we find interesting that is empowering, energizing, or whatever.

[859] And I found the idea of staring alone at a computer screen and then talking off of materials less inspiring than I wanted it to be.

[860] And I had in fact done a MOOC for Udacity on algorithms, and it was a week in a dark room talking at the screen, writing on the little pad. And I didn't know this was happening, but the crew had watched some of the videos while I was in the middle of this, and they're like, something's wrong, you're sort of shutting down.

[861] And I think a lot of it was I'll make jokes and no one would laugh.

[862] And I felt like the crowd hated me. Now, of course, there was no crowd.

[863] So, like, it wasn't rational.

[864] But each time I tried it and I got no reaction, it just was taking the energy out of my performance, out of my presentation.

[865] Such a fantastic metaphor for grad school.

[866] Anyway, by working together, we could play off each other.

[867] and keep the energy up, because you can't, you can't let your guard down for a moment with Charles.

[868] He'll just, he'll just overpower you.

[869] I have no idea what you're talking about.

[870] But we would work really well together.

[871] I thought, and we knew each other, so I knew that we could sort of make it work.

[872] Plus, I was the associate dean, so they had to do what I told him to do.

[873] We had to do it; we had to make it work.

[874] And so it worked out very well, I thought, well enough that we...

[875] With great power comes great power.

[876] That's right.

[877] And we became smooth and curly, and that's when we did the overfitting thriller video.

[878] Yeah, yeah, yeah, that's a thing.

[879] So, okay, can we just like smooth and curly?

[880] So, okay, so it happened, it was completely spontaneous.

[881] These are nicknames to go by, or, it's what the students call us.

[882] He was, he was lecturing.

[883] So the way that we structure the lectures is one of us is the lecturer and one of us is basically the student.

[884] And so the, he was lecturing on.

[885] The lecturer prepares all the materials, comes up with the quizzes, and then the student comes in not knowing anything.

[886] So it was, you know, just like being on campus.

[887] And I was doing game theory in particular, the prisoner's dilemma.

[888] And so he needed to set up a little.

[889] prisoner's dilemma grid.

[890] So he drew it and I could see what he was drawing.

[891] And the prisoner's dilemma consists of two players, two parties.

[892] So he decided he would make little cartoons of the two of us.

[893] And so there were two criminals, right, that were deciding whether or not to rat each other out.
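
The game being sketched on the board has a standard payoff structure, and the famous property is that ratting the other out ("defect") is each player's best response no matter what the other does. Here is a minimal sketch in Python; the payoff numbers are illustrative assumptions (years of freedom lost, so less negative is better), not values from the lecture:

```python
# Prisoner's dilemma: two parties each choose to stay silent
# ("cooperate") or rat the other out ("defect").
from itertools import product

C, D = "cooperate", "defect"

# payoff[(my_move, their_move)] = (my_payoff, their_payoff)
payoff = {
    (C, C): (-1, -1),   # both stay silent: light sentences
    (C, D): (-3,  0),   # I stay silent, they rat: I take the fall
    (D, C): ( 0, -3),   # I rat, they stay silent: I walk
    (D, D): (-2, -2),   # both rat: medium sentences
}

def best_response(their_move):
    """The move that maximizes my payoff given the other player's move."""
    return max([C, D], key=lambda my: payoff[(my, their_move)][0])

# Defecting is a dominant strategy: it is the best response to either
# move, even though mutual cooperation beats mutual defection for both.
assert best_response(C) == D and best_response(D) == D

# (D, D) is the unique Nash equilibrium of this symmetric game:
# neither player gains by deviating alone.
equilibria = [
    (a, b) for a, b in product([C, D], repeat=2)
    if best_response(b) == a and best_response(a) == b
]
print(equilibria)  # [('defect', 'defect')]
```

The dilemma is visible in the asserts: each player's individually rational move leaves both worse off than if they had both stayed silent.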

[894] One of them he drew as a circle with a smiley face and a kind of goatee thing, smooth head.

[895] And the other one with all sorts of curly hair.

[896] And he said, this is smooth and curly.

[897] I said smooth and curly.

[898] He said, no, no, smooth with a V. It's very important that it have a V. And then the students really took to that.

[899] Like, they've really, they found that relatable.

[900] He started singing Smooth Criminal by Michael Jackson.

[901] Yeah, yeah, yeah.

[902] And those names stuck.

[903] So we now have a video series, our kind of first actual episode should be coming out today, smooth and curly on video, where the two of us discuss episodes of Westworld.

[904] We watch Westworld and we're like, huh, what does this say about computer science and AI? And we've never, we did not watch it. I mean, I know it's on season three or whatever. As of this recording, it's on season three, and it's now two episodes total. Yeah, I think I watched three. So what do you think about Westworld, two episodes in? So I can tell you so far. Yeah, I'm just guessing what's going to happen next. It seems like bad things are going to happen with the robots uprising, spoiler alert. So I have not, I have not, I mean, you know, I vaguely remember a movie existing, so I assume it's related to that. But that was more my time than your time, Charles.

[905] That's right, because you're much older than I. I think the important thing here is that it's narrative, right?

[906] It's all about telling a story.

[907] That's the whole driving thing.

[908] But the idea that they would give these reveries, that they would make people, they would make them remember the awful things that happened.

[909] Who could possibly think that was a good idea?

[910] I got to, I mean, I don't know.

[911] I've only seen the first two episodes or maybe the third one.

[912] I think I've only seen the first one.

[913] You know what it was?

[914] You know what the problem is? The robots were actually designed by Hannibal Lecter.

[915] That's true.

[916] They were.

[917] So, like, what do you think is going to happen?

[918] Bad things.

[919] It's clear that things are happening and characters are being introduced and we don't yet know anything.

[920] But still, I was just struck by how it's all driven by narrative and story.

[921] And there's all these implied things like programming happen.

[922] The programming interface is talking to them about what's going on in their heads, which is both, I mean, artistically, it's probably useful to film it that way.

[923] But think about how it would work in real life.

[924] That just seems very crazy.

[925] But there was, we saw in the second episode, there's a screen.

[926] You could see things.

[927] They were wearing, like, Google Glass.

[928] It was quite interesting to just kind of ask this question so far.

[929] I mean, I assume it veers off into never-never land at some point.

[930] So we don't know.

[931] We can't answer that question.

[932] I'm also a fan of a guy named Alex Garland.

[933] He's the director of Ex Machina.

[934] And he is the first.

[935] I wonder if Kubrick was like this, actually.

[936] He, like, studies what it would take to program an AI system?

[937] Like, he's curious enough to go into that direction.

[938] On the Westworld side, I felt there was more emphasis on the narratives than, like, actually asking, like, computer science questions.

[939] Like, how would you build this?

[940] How would you?

[941] How would you debug it?

[942] To me, that's the key issue.

[943] They were terrible debuggers.

[944] Yeah.

[945] Well, they said specifically, so we make a change, and we put it out in the world, and that's bad because something terrible could happen.

[946] Like, if you're putting things out in the world and you're not sure whether something terrible is going to happen, your process is probably not good.

[947] I just feel like there should have been someone whose sole job it was to walk around and poke his head in and say, what could possibly go wrong, just over and over again.

[948] I would have loved if there was an, and I did watch a lot more, and I'm not giving anything away.

[949] I would have loved it if there was like an episode where like, like the new intern is like debugging a new model or something and like it just keeps failing.

[950] And they're like, all right.

[951] And then it's more turns into like an episode of Silicon Valley or something like that versus like this ominous AI systems that are constantly like threatening the fabric of this world that's been created.

[952] Yeah, and you know, this reminds me of something. So I agree that that would actually be very cool, at least, well, for the small percentage of people who care about debugging systems.

[953] But the other thing is, debugging, the series.

[954] Yeah, and think of the sequels: Fear of the Debugging.

[955] Oh, my gosh.

[956] Anyway, so.

[957] It's a nightmare show.

[958] It's a horror movie.

[959] I think that's where we lose people, by the way, early on is the people who either decide, either figure out debugging or think debugging is terrible.

[960] where we lose people in computer science. This is part of the struggle versus suffering, right? You get through it and you kind of get the skills of it, or you're just like, this is dumb and this is a dumb way to do anything. And I think that's when we lose people. But, well, I'll leave it at that. But I think that there's something really, really neat about framing it that way. But what I don't like about all of these, all of these things, and I love Ex Machina, by the way, although the ending was very depressing. One of the things I have to talk to Alex about, he says that the thing that nobody noticed, that he put in, is at the end, spoiler alert, the robot turns and looks at the camera and smiles briefly.

[961] And to him, he thought that his definition of passing the Turing test, the general version of the Turing test, or the consciousness test, is...

[962] smiling for no one.

[963] Like, like, not, oh, you know, it's like the Chinese room kind of experiment.

[964] It's not always trying to act for others, but just on your own, being able to have a relationship with the actual experience and just, like, take it in.

[965] I don't know.

[966] He said, like, nobody noticed the magic of it.

[967] I have this vague feeling that I remember the smile, but now you just put the memory in my head, so probably not.

[968] But I do think that that's interesting, although by looking at

[969] the camera, you are smiling for the audience, right?

[970] You're breaking the fourth wall.

[971] It seems, I mean, well, that's a limitation of the medium, but I like that idea.

[972] But here's the problem I have with all of those movies, all of them, and I know why it's this way, and I enjoy those movies, and Westworld: it sets up the problem of AI as succeeding and then becoming something we cannot control. But that's not the bad part of AI.

[973] The bad part of AI is the stuff we're living through now, right?

[974] It's using the data to make decisions that are terrible.

[975] It's not the intelligence that's going to go out there and surpass us and, you know, take over the world or, you know, lock us into a room to starve to death slowly over multiple days.

[976] It's instead the tools that we're building that are allowing us to make the terrible decisions we would have less efficiently made before, right?

[977] You know, computers are very good at making us more efficient, including being more efficient at doing terrible things.

[978] And that's the part of the AI we have to worry about.

[979] It's not the, you know, true intelligence that we're going to build sometime in the future, probably long after we're around.

[980] But, you know, I just, I think that whole framing of it sort of misses the point, even though it is inspiring.

[981] And I was inspired by those ideas, right, that I got into this in part because I wanted to build something like that.

[982] Philosophical questions are interesting me, but, you know, that's not where the terror comes from.

[983] The terror comes from the every day.

[984] And you can construct the situation, it's in the subtlety of the interaction between A. and the human, like with social networks, all the stuff you're doing with interactive artificial intelligence.

[985] But, you know, I feel like HAL 9000 came a little bit closer to that in 2001: A Space Odyssey, because it felt like a personal assistant.

[986] You know, it felt like closer to the AI systems we have today.

[987] And the real things we might actually encounter, which is over-relying in some fundamental way on our, like, dumb assistants, or on social networks. Like, offloading too much of us onto, you know, things that require internet and power and so on, and thereby becoming powerless as a standalone entity. And then when that thing starts to misbehave in some subtle way, it creates a lot of problems.

[988] And those problems are dramatized when you're in space because you don't have a way to walk away.

[989] Well, as the man said, once we started making the decisions for you, it stopped being your world, right?

[990] That's the matrix, Michael, in case you don't.

[991] I didn't catch it, thank you.

[992] You don't remember.

[993] But on the other hand, I could say, no, because isn't that what we do with people anyway?

[994] You know, the shared intelligence that is humanity is relying on other people constantly.

[995] I mean, we hyper -specialize, right, as individuals.

[996] We're still generally intelligent.

[997] We make our own decisions in a lot of ways, but we leave most of this up to other people, and that's perfectly fine.

[998] And by the way, everyone doesn't necessarily share our goals.

[999] Sometimes they seem to be quite against us.

[1000] Sometimes we make decisions that others would see as against our own interests, and yet we somehow manage it, manage to survive.

[1001] I'm not entirely sure why an AI would actually make that worse, or even different, really.

[1002] You mentioned the Matrix?

[1003] Do you think we're living in a simulation?

[1004] It does feel like a thought game more than a real scientific question.

[1005] Well, I'll tell you why, like I think it's an interesting thought experiment to see what you think.

[1006] From a computer science perspective, it's a good experiment of how difficult would it be to create a sufficiently realistic world that us humans would enjoy being in?

[1007] That's almost like a competition.

[1008] If we're living in a simulation, then I don't believe that we were put in the simulation.

[1009] I believe that it's just physics playing out, and we came out of that.

[1010] Like, I don't, I don't, I don't think.

[1011] So you think you have to build the universe?

[1012] I think that the universe itself, we can think of that as a simulation.

[1013] And in fact, I try, sometimes I try to think about, to understand what it's like for a computer to start to think about the world.

[1014] I try to think about the world, things like quantum mechanics, where it doesn't feel very natural to me at all.

[1015] and it really strikes me as, I don't understand this thing that we're living in. There are weird things happening in it that don't feel natural to me at all. Now, if you want to call that the result of a simulator, okay, I'm fine with that. But there's the bugs in the simulation. There's the bugs. I mean, the interesting thing about a simulation is that it might have bugs. But there would be bugs for the people in the simulation; that's just reality, unless you were aware enough to know that there was a bug. But I think...

[1016] Back to the matrix.

[1017] Yeah, the way you put the question, though.

[1018] I don't think that we live in a simulation created for us.

[1019] Okay, I would say that.

[1020] I think that's interesting.

[1021] I've actually never thought about it that way.

[1022] I mean, the way you asked the question, though, is could you create a world that is enough for us humans?

[1023] It's an interestingly sort of self-referential question because the beings that created the simulation probably have not created a simulation that's realistic for them.

[1024] But we're in the simulation, and so it's realistic for us.

[1025] So we could create a simulation that is fine for the people in the simulation, as it were, that would not necessarily be fine for us as the creators of the simulation.

[1026] But, well, you can forget, I mean, when you go into the, if you play video games or virtual reality, you can, with some suspension of disbelief or whatever, it becomes the world.

[1027] It becomes the world, even like in brief moments, you forget that another world exists.

[1028] I mean, that's what, like, good stories do, they pull you in.

[1029] The question is, is it possible to pull, you know, our brains are limited.

[1030] Is it possible to pull the brain in to where we actually stay in that world longer and longer and longer and longer?

[1031] And like not only that, but we don't want to leave.

[1032] And so especially this is the key thing about the developing brain is if we journey into that world early on in life often.

[1033] How would you even know?

[1034] Yeah.

[1035] So I, but like, from a video game design perspective, from a Westworld perspective, I think it's an important thing for even computer scientists to think about, because it's clear that video games are getting much better. And virtual reality, although it's been ups and downs just like artificial intelligence, it feels like virtual reality will be here in a very impressive form if we were to fast forward 100 years into the future, in a way that might change society fundamentally. Like, I'm very limited in predicting the future, as all of us are, but if I were to try to predict in which way I'd be surprised, or impressed, to see the world 100 years from now, it'd be that we're all no longer living in this physical world, that we're all living in a virtual world.

[1036] You really need to read Calculating God by Sawyer.

[1037] He'll read it in a night.

[1038] It's a very easy read, but it's assuming you're that kind of reader.

[1039] But it's a good story, and it's kind of about this.

[1040] not in a way that it appears.

[1041] And I really enjoyed the thought experiment.

[1042] I'm pretty sure it's Robert Sawyer.

[1043] But anyway, he's apparently Canada's top science fiction writer, which is why the story mostly takes place in Toronto.

[1044] But it's a very good, it's a very good sort of story that sort of imagines this.

[1045] Very different kind of simulation hypothesis sort of thing from, say, The Egg, for example.

[1046] You know, you know, I'm talking about the short story.

[1048] By the guy who did The Martian.

[1049] Who wrote The Martian?

[1050] You know I'm talking.

[1051] The Martian.

[1052] Matt Damon.

[1053] No. The book.

[1054] So we had this whole discussion that Michael doesn't partake in this exercise of reading.

[1055] He doesn't seem to like it, which seems very strange to me, considering how much he has to read.

[1056] I read all the time.

[1057] I used to read 10 books every week when I was in sixth grade or whatever.

[1058] A lot of it was science fiction, a lot of history, but I love to read.

[1059] But anyway, you should read Calculating God.

[1060] It's a very easy read, like I said, and I think you'll enjoy sort of the ideas that it presents.

[1061] Yeah, I think the thought experiment is quite interesting.

[1062] One thing I've noticed about people growing up now, I mean, we'll talk about social media, but video games is a much bigger, bigger and bigger and bigger part of their lives, and the video games have become much more realistic.

[1063] I think it's possible that the three of us are not, maybe the two of you are not familiar exactly with the numbers we're talking about here.

[1064] I think the number of people...

[1065] It's bigger than movies, right?

[1066] It's huge.

[1067] I used to do a lot of the computational narrative stuff.

[1068] I understand that economists can actually see the impact of video games on the labor market.

[1069] That there are, there's fewer young men of a certain age participating in, like, paying jobs than you'd expect.

[1070] and that they trace it back to video games.

[1071] I mean, the problem with Star Trek was not warp drive or teleportation.

[1072] It was the holodeck.

[1073] Like, if you have the holodeck, that's it.

[1074] That's it.

[1075] You go in the holodeck, you never come out.

[1076] I mean, it just never made sense.

[1077] Once I saw that, I thought, okay, well, so this is the end of humanity, as we know it, right?

[1078] They've invented the holodeck.

[1079] Because that feels like the singularity, not some AGI or whatever.

[1080] It's some possibility to go into another world.

[1081] that can be artificially made better than this one.

[1082] And slowing it down so you live forever, or speeding it up so you appear to live forever, or making the decision of when to die.

[1083] And then most of us will just be old people on the porch yelling at the kids these days in their virtual reality.

[1084] But they won't hear us because they've got headphones on.

[1085] So, I mean, rewinding back to MOOCs, are there lessons that you've... speaking of kids these days?

[1086] There you go.

[1087] That was a transition.

[1088] That was fantastic.

[1089] I'll fix it in post.

[1090] That's Charles' favorite phrase.

[1091] Fix it in post?

[1092] Fix it in post?

[1093] When we were recording all the time, whenever the editor didn't like something or whatever, I would say, we'll fix it in post.

[1094] He hated that.

[1095] He hated that more than anything.

[1096] Because it was Charles's way of saying, I'm not going to do it again.

[1097] You're on your own for this one.

[1098] But it always got fixed in post.

[1099] Exactly.

[1100] So is there something?

[1101] you've learned about, I mean, it's interesting to talk about MOOCs.

[1102] Is there something you've learned about the process of education, about thinking about the present?

[1103] I think there's two lines of conversation to be had here: the future of education in general that you've learned about.

[1104] And more presciently is the education in the times of COVID.

[1105] Yeah.

[1106] The second thing in some ways matters more than the first, at least in my head, not just because it's happening now, but because I think it's reminded us of a lot of things.

[1107] Coincidentally, today there's an article out by a good friend of mine, who's also a professor at Georgia Tech, but more importantly a writer and editor at the Atlantic, named Ian Bogost.

[1108] And the title is something like Americans will sacrifice anything for the college experience.

[1109] And it's about why we went back to college and why people wanted us to go back to college.

[1110] And it's not, you know, greedy presidents trying to get the last dollar from someone.

[1111] they want to go to college.

[1112] And what they're paying for is not the classes.

[1113] What they're paying for is the college experience.

[1114] It's not the education.

[1115] It's being there.

[1116] I've believed this for a long time, that we continually make this mistake of interpreting people wanting to go back to college as people wanting to go back to class.

[1117] They don't.

[1118] They want to go back to campus.

[1119] They want to move away from home.

[1120] They want to do all those things that people experience.

[1121] It's a rite of passage.

[1122] It's an identity, if I can steal some of Ian's words here.

[1123] And I think that's right.

[1124] And I think what we've learned through COVID is that the disaggregation was not the disaggregation of the education from the university place, that you can get the best anywhere you want to; there's lots of reasons why that is not necessarily true. The disaggregation is having it shoved in our faces that the reason to go to college is not necessarily to learn, it's to have the college experience. And that's very difficult for us to accept, even though we behaved that way, most of us, when we were undergrads. You know, a lot of us didn't go to every single class.

[1125] We learned and we got it, and we look back on it and we're happy we had the learning experience as well, obviously particularly us, because this is the kind of thing that we do.

[1126] And my guess is that's true of the vast majority of your audience.

[1127] But that doesn't mean that me standing in front of you telling you this is the thing that people are excited about.

[1128] And that's why they want to be there, primarily why they want to be there.

[1129] So to me, that's what COVID has forced us to deal with, even though I think we're still all in deep denial about.

[1130] it, and hoping that it'll go back to that. And I think about 85% of it will; we'll be able to pretend that that's really the way it is again, and we'll forget the lessons of this. But technically, yeah, or technologically, what'll come out of it is a way of providing a more dispersed experience through online education and these kinds of remote things that we've learned. And we'll have to come up with new ways to engage them in the experience of college, which includes not just the parties or whatever kids do, but the learning part of it, so that they actually come out four or five or six years later having actually learned something.

[1131] So I think the world will be radically different afterwards, and I think technology will matter for that, just not in the way that the people who were building the technology originally imagined it would be.

[1132] And I think this would have been true, even without COVID, but COVID has accelerated that reality.

[1133] So it's happening in two or three years or five years as opposed to 10 or 15.

[1134] That was an amazing answer that I did not understand.

[1135] It was passionate, and shots fired. But I don't know, I just didn't... No, I'm not trying to criticize it. I think I'm, I don't think I'm getting it. So you mentioned disaggregation. So what's that?

[1136] Well, so, you know, the power of technology. If you go on the West Coast and hang out long enough, it's all about, we're going to disaggregate these things, the books from the bookstore, you know, that kind of a thing, and then suddenly Amazon controls the universe, right? And technology is a disruptor, right? And people have been predicting that for higher education for a long time. But it certainly hasn't ended up that way. So is this the sort of idea, like, students can aggregate on a campus someplace and then take classes over the network anywhere?

[1137] Yeah, this is what people thought was going to happen, or at least people claimed what's going to happen, right?

[1138] Because my daughter is essentially doing that now.

[1139] She's on one campus, but learning in a different campus.

[1140] Sure, and COVID makes that possible, right?

[1141] COVID makes that all but unavoidable, right?

[1142] But the idea originally was that, you know, you and I were going to create this machine learning class and it was going to be great, and then it would be the machine learning class everyone takes, right?

[1143] That was never going to happen, but, you know, something like that, you could see happening.

[1144] But I feel you didn't address that.

[1145] So why, why is it that?

[1146] Why?

[1147] I don't think that will be the thing that happens.

[1148] So the college experience, maybe I, maybe I missed what the college experience was.

[1149] I thought it was peers, like people hanging around.

[1150] A large part of it is peers.

[1151] Well, it's peers and independence.

[1152] Yeah, but you can do classes online for all of that.

[1153] No, no, no, because we're social people, right?

[1154] So one would take the classes, that also has to be part of an experience.

[1155] It's in a context, and the context is the university.

[1156] And by the way, it actually matters that Georgia Tech really is different from Brown.

[1157] I see, because then students can choose the kind of experience they think is going to be best for them.

[1158] Okay.

[1159] I think we're giving too much agency to the students in making an informed decision.

[1160] Okay.

[1161] But the truth, but yes, they will make choices and they will have different experiences.

[1162] And some of those choices will be made for them.

[1163] Some of them will be choices they're making because they think it's this, that or the other.

[1164] I just don't want to say, I don't want to give the idea.

[1165] It's not homogenous.

[1166] Yes, it's certainly not homogenous, right?

[1167] I mean, Georgia Tech is different from Brown.

[1168] Brown is different from pick your favorite state school in Iowa.

[1169] Iowa State, okay?

[1170] Which I guess is my favorite state school in Iowa.

[1171] Sure.

[1172] But, you know, these are all different.

[1173] They have different contexts.

[1174] And a lot of those contexts are, they're about history, yes, but they're also about the location of where you are.

[1175] They're about the larger group of people who are around you, whether you're in Athens, Georgia, and you're basically the only thing that's there as a university.

[1176] you're responsible for all the jobs, or whether you're at Georgia State University, which is an urban campus, where you're surrounded by, you know, six million people, and where your campus ends and the city begins, we don't know.

[1177] It actually matters, whether it was small campus or a large campus.

[1178] I mean, these things matter.

[1179] Why is it that if you go to Georgia Tech, you're, like, forever proud of that?

[1180] And you, like, say that to people at dinner, like, bars and whatever.

[1181] And if you get a degree at an online university somewhere, that's not a thing that comes up at a bar.

[1182] Well, it's funny you say that.

[1183] So the students who take our online master's, by several measures, are more loyal than the students who come on campus, certainly for the master's degree.

[1184] The reason for that, I think, and you'd have to ask them, but based on my conversations with them, I feel comfortable saying this, is because this didn't exist before.

[1185] I mean, we talk about this online master's and that it's reaching 11,000 students, and that's an amazing thing, and we're admitting everyone we believe can succeed.

[1186] We've got a 60% acceptance rate.

[1187] It's amazing, right?

[1188] It's also a $6,600 degree.

[1189] The entire degree costs $6,600 to $7,000, depending on how long you take, as opposed to the $46,000 it costs you to come on campus.

[1190] So that feels, and I can do it while I'm working full-time, and I've got a family and a mortgage and all these other things.

[1191] So it's an opportunity to do something you wanted to do, but you didn't think was possible without giving up two years of your life as well as all the money and everything else in the life that you had built.

[1192] So I think we created something that's had an impact, but importantly, we gave a set of people opportunities they otherwise didn't feel they had.

[1193] So I think people feel very loyal about that.

[1194] And my biggest piece of evidence for that, besides surveys, is that we have somewhere north of 80 students, maybe 100 at this point, who graduated but come back and TA for this class for basically minimum wage, even though they're working full time, because they believe in sort of having that opportunity.

[1195] and they want to be a part of something.

[1196] Now, will Generation 3 feel this way?

[1197] 15 years from now, will people have that same sense?

[1198] I don't know.

[1199] But right now, they kind of do.

[1200] And so it's not the online part, it's a matter of feeling as if you're a part of something, right?

[1201] We're all very tribal.

[1202] Yeah.

[1203] And I think there's something very tribal about being a part of something like that.

[1204] Being on campus makes that easier.

[1205] Going through a shared experience makes that easier.

[1206] It's harder to have that shared experience if you're alone looking at a computer screen, but we can create ways to make that happen.

[1207] Is it possible?

[1208] It is possible.

[1209] The question is, it still is my intuition, and it was at the beginning when I saw something like the online master's program, that this is going to replace universities.

[1210] And it won't replace universities.

[1211] But like, why is it?

[1212] Because it's living in a different part of the ecosystem, right?

[1213] The people who are taking it are already adults.

[1214] They've gone through their undergrad experience.

[1215] I think their goals have shifted from when they were 17.

[1216] They have other things that are going on.

[1217] But it does do something really important, something very social and very important, right?

[1218] You know this whole thing about, you know, don't build the sidewalks, just leave the grass and the students or the people will walk and you put the sidewalks where they create paths.

[1219] That's interesting, yeah.

[1220] The architects apparently believe that's the right way to do things.

[1221] The metaphor here is that we created this environment.

[1222] We didn't quite know how to think about the social aspect.

[1223] but, you know, we didn't have time to do all the social engineering, right?

[1224] The students did it themselves.

[1225] They created, you know, these groups, like on Google Plus.

[1226] There were like 30-something groups created in the first year because somebody had these Google Plus groups.

[1227] And they created these groups, and they divided up in ways that made sense.

[1228] We live in the same state or we're working on the same things.

[1229] We have the same background or whatever.

[1230] And they created these social things.

[1231] We sent them T-shirts.

[1232] And we have all these great pictures of students putting on their T-shirts as they travel around the world.

[1234] I climbed to this mountain top.

[1235] I'm putting this t-shirt on.

[1236] I'm a part of this.

[1237] They were a part of it.

[1238] They created the social environment on top of the social network and the social media that existed to create this sense of belonging and being a part of something.

[1239] They found a way to do it, right?

[1240] And I think it scratched an itch that they had, an itch that long before might have required them to be physically in the same place.

[1241] Right.

[1242] So I think, yes, it's possible.

[1243] And it's more than possible.

[1244] And it's more than possible it's necessary.

[1245] But I don't think it's going to replace the university as we know it.

[1246] The university as we know it will change.

[1247] But there's just a lot of power in the kind of rite of passage of going off by yourself.

[1248] Now maybe there'll be some other rite of passage that'll happen.

[1249] Right.

[1250] That's the question.

[1251] Whether you can separate.

[1252] So the university is such a fascinating mess of things.

[1253] So just even the faculty position is a fascinating mess.

[1254] Like it doesn't make any sense.

[1255] It's stabilized itself.

[1256] But like, why are the world-class researchers spending a huge amount of their time on teaching and service?

[1257] Like, you're doing like three jobs.

[1258] Yeah.

[1259] And, I mean, it's maybe an accident of history or human evolution.

[1260] I don't know.

[1261] It seems like the people who are really good at teaching are often really good at research.

[1262] There seems to be a parallel there.

[1263] But, like, it doesn't make any sense that you should be doing that.

[1264] At the same time, it also doesn't seem to make sense that your place where you party is the same place where you go to learn calculus or whatever.

[1265] But it's a safe space.

[1266] Safe space for everything.

[1267] Yeah, relatively speaking, it's a safe space.

[1268] Now, by the way, I feel the need very strongly to point out that we are living in a very particular weird bubble, right?

[1269] Most people don't go to college.

[1270] And by the way, the ones who do go to college, they're not 18 years old, right?

[1271] they're like 25 or something.

[1272] I forget the numbers.

[1273] You know, the places where we've been, where we are, they look like whatever we think the traditional movie version of universities are.

[1274] But for most people, it's not that way at all.

[1275] By the way, most people who drop out of college, it's entirely for financial reasons, right?

[1276] So, you know, we were talking about a particular experience.

[1277] And so for that set of people, which is very small, but certainly larger than it was a decade or two or three or four ago.

[1278] I don't think that will change.

[1279] My concern, which I think is kind of implicit in some of these questions, is that somehow we will divide the world up further into the people who get to have this experience and get to have the network and they sort of benefit from it and everyone else while increasingly requiring that they have more and more credentials in order to get a job as a barista, right?

[1280] You got to have a master's degree in order to work at Starbucks.

[1281] We're going to force people to do these things, but they're not going to get to have that experience, and there'll be a small group of people who do, who continue to, you know, get the positive feedback, etc., etc. I worry a lot about that, which is why, for me, you have to focus on access and the mission. And by the way, here's an answer to your question about faculty, whether it's good, bad, or strange, and I mean, I agree it's strange.

[1282] But I think it's useful, particularly at large R1 universities where we've all had experiences, that you tie what the faculty member gets to do with the funding mission of the university and let the mission drive.

[1283] What I hear when I talk to faculty is they love their PhD students because they're creating, they're reproducing basically, right?

[1284] And it lets them do their research and multiply.

[1285] But they understand that the mission is the undergrads.

[1286] And so they will do it without complaint mostly because it's a part of the mission and why they're here.

[1287] And they have experiences within themselves.

[1288] And it was important to get them where they were going.

[1289] The people tend to get squeezed in that, by the way, are the master students, right?

[1290] who are neither the PhDs, who are like us, nor the undergrads, whom we have already bought into the idea that we have to teach, though that's increasingly changing anyway. I think tying that mission in really matters, and it gives you a way to unify people around making it an actual higher calling. Education feels like more of a higher calling to me than even research, because you cannot treat education as a hobby if you're going to do it well. But that's the pushback on this whole system, is that shouldn't education be a full-time job, right?

[1291] And almost like research is a distraction from that.

[1292] Yes, although I think most of our colleagues, many of our colleagues would say that research is the job and education is the distraction.

[1293] Right, but that's the beautiful dance.

[1294] It seems that tension in itself works, seems to bring out the best in the faculty.

[1295] Or at least the ones I've known.

[1296] But I will point out two things.

[1297] One thing I'm going to point out, the other thing I want Michael to point out because I think Michael is much closer to the ideal professor in some sense than I am.

[1298] You're the platonic sense of a professor.

[1299] I don't know what he meant by that, but he is a dean, so he has a different experience.

[1300] I'm giving him time to think of the profound thing he's going to say.

[1301] But let me point this out, which is that we have lecturers in the College of Computing where I am.

[1302] There's 10 or 12 of them, depending on how you count, as opposed to the 90 or so tenure-track faculty.

[1303] Those 10 lecturers who only teach, well, they don't only teach.

[1304] They also do service.

[1305] Some of them do research as well, but primarily they teach.

[1306] They teach 50%, over 50% of our credit hours.

[1307] And we teach everybody.

[1308] So they're doing more than eight times the work of the tenure-track faculty, maybe closer to nine or ten.

[1309] And that's including our grad courses, right?

[1310] So they're doing this.

[1311] They're teaching more.

[1312] they're touching more than anyone, and they're beloved for it. I mean, so we recently had a survey. We do these alumni surveys, everyone does these alumni surveys, you hire someone from the outside to do it. And I was really struck by something. You saw all these really cool numbers. I'm not going to talk about them because, you know, it's all internal confidential stuff. But one thing I will talk about is there was a single question we asked our alumni, and these are people who graduated, you know, born in the 30s and 40s, all the way to the people who graduated last week, right? Well, last semester. Okay, good. Time flies. Yeah, time flies. And there was a question.

[1313] Name a single person who had a strong positive impact on you, something like that.

[1314] I think it was special impact?

[1315] Yeah, special impact on you.

[1316] And then, so they got all the answers from people, and they created a word cloud.

[1317] There was clearly a word cloud created by people who don't do word clouds for a living because they had one person whose name, like, appeared nine different times, like Philip, Phil, Dr. Phil, you know, but whatever.

[1318] But they got all this.

[1319] And I looked at it, and I noticed something really cool.

[1320] The five people from the College of Computing, I recognized, were in that cloud.

[1321] And four of them were lecturers, the people who teach.

[1322] Two of them, relatively modern.

[1323] Both were chairs of our Division of Computing Instruction, one retired, one is going to retire soon.

[1324] And the other two were lecturers I remembered from the 1980s.

[1325] Two of those four actually have...

[1326] By the way, the fifth person was Charles.

[1327] That's not important.

[1328] The thing is, I don't know, don't tell people that.

[1330] But two of those people, our teaching awards are named after.

[1331] Thank you, Michael.

[1332] Two of those, our teaching awards are named after, right?

[1333] So when you ask students, alumni, people who are now 60, 70 years old even, you know, who touched them, they say the dean of students.

[1334] They say the big teachers who taught the big introductory classes that got me into it.

[1335] There's a guy named Richard Bark who's on there, who's, you know, known as a great teacher, the Phil Adler guy, whose last name I probably just said wrong, but I know the first name's Phil because he kept showing up over and over again.

[1336] It's famous...

[1337] Adler is what it said.

[1338] Okay, good.

[1339] But different people spelled it differently, so he appeared multiple times.

[1340] Right.

[1341] So he was a, clearly, he was a professor in the business school.

[1342] But when you read about him, and I went to read about him because I was curious who he was, you know, it's all about his teaching and the students that he touched, right?

[1343] So whatever it is that we're doing that we think is important, or however we think the universities function, the people who go through it...

[1344] They remember the people who were kind to them, the people who taught them something.

[1345] And they do remember it.

[1346] They remember it later.

[1347] I think that's important.

[1348] So the mission matters.

[1349] Yeah.

[1350] Not to completely lose track of the fundamental problem of how we replace the party aspect of universities.

[1351] Before we go to what makes the platonic professor, do you think, like, what in your sense is the role of MOOCs in this whole picture during COVID?

[1352] Like, are we, should we desperately be clamoring to get back on campus, or is this a stable place to be for a little while? I don't know. I know that the online teaching experience and learning experience has been really rough. I think that people find it to be a struggle in a way that's not a happy, positive struggle, where when you got through it, you just feel glad that it's over as opposed to, I've achieved something. So, you know, I worry about that. But, you know, even before this happened, I worried about lecture teaching, about how well that is actually really working as a way to do education, as a way to inspire people.

[1353] I mean, all the data that I'm aware of seems to indicate, and this kind of fits, I think, with Charles' story, is that people respond to connection, right?

[1354] They actually feel, if they feel connected to the person teaching the class, they're more likely to go along with it.

[1355] They're more able to retain information.

[1356] They're more motivated to be involved in the class in some way.

[1357] And that really matters.

[1358] People...

[1359] You mean to the human themselves.

[1360] Yeah.

[1361] So can't you do that actually perhaps more effectively online?

[1362] Like you mentioned, science communication.

[1363] So I literally, I think, learned linear algebra from Gilbert Strang by watching MIT OpenCourseWare when I was an undergrad.

[1364] And he was a personality. In this tiny little world of math, he's a bit of a rock star, right?

[1365] So you kind of look up to that person.

[1366] Can't that replace the in-person education?

[1367] It can help.

[1368] I will point out something.

[1369] I can't share the numbers, but we have surveyed our students.

[1370] And even though they have feelings about what I would interpret as connection, I like that word, in the different modes of classrooms, there's no difference.

[1371] between how well they think they're learning.

[1372] For them, the thing that makes them unhappy is the situation they're in.

[1373] And I think the lack of connection, it's not whether they're learning anything.

[1374] They seem to think they're learning something anyway, right?

[1375] And in fact, they seem to think they're learning it equally well, presumably because the faculty, or the instructors more generally speaking, are putting in the energy and effort to try to make certain that what they've curated can be expressed to them in a useful way.

[1376] But the connection is missing.

[1377] And so there's huge differences in what they prefer.

[1378] And as far as I can tell, what they prefer is more connection, not less.

[1379] That connection just doesn't have to be physically in a classroom.

[1380] I mean, look, I used to teach 348 students in a machine learning class on campus.

[1381] Do you know why?

[1382] That was the biggest classroom on campus.

[1383] They're sitting in a theater, they're sitting in theater seats.

[1384] I'm literally on a stage looking down on them and talking to them, right?

[1385] There's no, I mean, we're not sitting down having a one-on-one conversation, reading each other's body language, trying to communicate. We're not doing any of that.

[1386] So, you know, if you're past the third row, it might as well be online anyway, is the kind of thing that people have said. Daphne has actually said some version of this, that online starts at the third row or something like that.

[1387] And I think that's not, yeah, I like it.

[1388] I think it captures something important.

[1389] But people still came, by the way.

[1390] Even the people who had access to our material would still come to class.

[1391] I mean, there's a certain element about looking to the person next to you.

[1392] Yeah.

[1393] It's just like their presence there, their boredom and like when the parts are boring and their excitement, when the parts are exciting, like, and sharing in that, like unspoken kind of, yeah, communication.

[1394] Like, in part, the connection is with the other people in the room.

[1395] Watching the circus on TV alone is not really the same.

[1396] Have you ever been to a movie theater and been the only one there at a comedy?

[1397] It's not as funny as when you're in a room full of people all laughing.

[1399] Well, you need, maybe you need just another person.

[1400] It's like, as opposed to many.

[1401] Maybe, maybe there's some kind of...

[1402] Well, there's different kinds of connection, right?

[1403] And there's different kinds of comedy.

[1404] Well, in the sense that...

[1405] As we're learning today.

[1406] I wasn't sure if that was going to land.

[1407] But just the idea that different jokes, I've now done a little bit of stand-up.

[1408] And so different jokes work in different size crowds, too.

[1409] No, that's true.

[1410] Where sometimes if it's a big enough crowd, then even a really subtle joke can take root someplace and then that cues other people.

[1411] And it kind of, there's a whole statistics of, I did this terrible thing to my brother.

[1412] So when I was really young, I decided that my brother was only laughing when I laughed.

[1413] Like he was taking cues from me. So I, like, purposely didn't laugh just to see if I was right.

[1414] And did you laugh at non -funny things?

[1415] Yes.

[1416] You really wanted to do both sides.

[1417] I did both sides.

[1418] And at the end of it, I told him what I did.

[1419] He was very upset about this.

[1420] And from that day on...

[1421] He lost his sense of humor.

[1422] No, no, no, no. Well, yes.

[1423] But from that day on, he laughed on his own.

[1424] He stopped taking cues from me. So I want to say that, you know, it was a good thing that I did.

[1425] Yes, yes.

[1426] It was mostly mean.

[1427] Yes, but it was mostly mean.

[1428] But it's true, though.

[1429] It's true, right?

[1430] That people...

[1431] I think you're right.

[1432] But, okay, so where does that get us?

[1433] That gets us the idea that...

[1434] I mean, certainly movie theaters are a thing, right, where people like to be watching together, even though the people on the screen aren't really co-present with the people in the audience.

[1435] The audience is co-present with itself.

[1436] By the way, on that point, there's an open question that's being raised by this, whether movies will no longer be a thing because Netflix's audience is growing.

[1437] So that's, it's a very parallel question for education.

[1438] Will movie theaters still be a thing in 2021?

[1439] No, but I think the argument is that there is a feeling of being in the crowd that isn't replicated by being at home watching it, and that there's value in that.

[1440] And then I think just...

[1441] But...

[1442] But...

[1443] It scales better online.

[1444] But I feel like we're having a conversation about whether concerts will still exist after the invention of the record or the CD or whatever it is, right?

[1445] They won't.

[1446] You're right.

[1447] Concerts are dead.

[1448] Well, okay, I think the joke is only funny if you say it before now.

[1449] Right, yeah, like three years ago. It's like, well, no, obviously, I'll wait to publish this until we have a vaccine, you know, we'll fix it in post. But I think the important thing is that concerts changed, right? First of all, movie theaters weren't this way, right, in like the 60s and 70s. They weren't like this. Blockbusters were basically, what, Jaws and Star Wars created blockbusters, right? Before then, the whole shared summer experience didn't exist. It came about in our lifetimes, right?

[1450] Certainly you were well into adulthood by the time this was true, right?

[1451] So it's just a very different, it's very different.

[1452] So what we've been experiencing in the last 10 years is not like the majority of human history.

[1453] But more importantly, concerts, right?

[1454] Concerts mean something different.

[1455] Most people don't go to concerts anymore.

[1456] Like there's an age where you care about it.

[1457] You sort of stop doing it, but you keep listening to music or whatever and da -da -da -da -da -da -da.

[1458] So I think that's a painful way of saying that it will change.

[1459] It's not that the same things are going away.

[1460] Replace is too strong a word.

[1461] But it will change.

[1462] It has to.

[1463] I actually, like, to push back, I wonder, because I think you're probably just throwing your intuition out there.

[1464] Oh, I definitely, very much.

[1465] It's possible that more people go to concerts now, but obviously many more people listen to music than before there were records.

[1466] It's possible to argue, if you look at the data, that it just expanded the pie of what music listening means. So it's possible that, in parallel, universities grow, or the theaters grow, but also more people get to watch movies, more people get to, like, be educated. Yeah, I hope that, yeah. And to the extent that we can grow the pie and have education be not just something you do for four years when you're done with your other education, but a more lifelong thing.

[1467] That would have tremendous benefits, especially as the economy and the world change rapidly.

[1468] Like, people need opportunities to stay abreast of these changes.

[1469] And so, I don't know, I could, I could, it's all part of the ecosystem.

[1470] It's all to the good.

[1471] I mean, you know, I'm not going to have an argument about whether we lost fidelity.

[1472] We went from Laserdisc to DVDs or record players to CDs.

[1473] I mean, I'm willing to grant that that is true.

[1474] But convenience matters and the ability to do something that you couldn't do otherwise because of that convenience matters.

[1475] And you can tell me I'm only getting 90 % of the experience, but I'm getting the experience.

[1476] I wasn't getting it before or it wasn't lasting as long or it wasn't as easy.

[1477] I mean, this just seems straightforward to me. It's going to change.

[1478] It is for the good that more people get access.

[1479] And it is our job to do two separate things.

[1480] One, to educate them and make access available.

[1481] That's our mission.

[1482] But also, for very simple selfish reasons, we need to figure out how to do it better so that we individually stay in business.

[1483] We can do both of those things at the same time.

[1484] They are not in, they may be in tension, but they are not mutually exclusive.

[1485] So you've educated some scary number of people.

[1486] So you've seen a lot of people succeed, find their path to their life.

[1487] Is there advice that you can give to a young person today about computer science education, about education in general, about life, about whatever journey one takes, maybe in their teens, in their early 20s, sort of in those undergrad years, as you try to go through the essential process of partying and not going to classes, and yet somehow trying to get a degree?

[1488] If you get to the point where you're far enough up in the hierarchy of needs that you can actually make decisions like this, then find the thing that you're passionate about and pursue it.

[1489] And sometimes it's the thing that drives your life, and sometimes it's secondary.

[1490] And you'll do other things because you've got to eat, right?

[1491] You've got a family you've got to feed, you've got people you have to help or whatever.

[1492] And I understand that, and it's not easy for everyone.

[1493] But always take a moment or two to pursue the things that you love, the things that bring passion and happiness to your life.

[1494] I know that sounds corny, but I genuinely believe it.

[1495] And if you don't have such a thing, then you're lying to yourself.

[1496] You have such a thing.

[1497] You just have to find it.

[1498] And it's okay if it takes you a long time to get there.

[1499] Rodney Dangerfield became a comedian in his 50s, I think.

[1500] It certainly wasn't his 20s.

[1501] And lots of people failed for a very long time before getting to where they were going.

[1502] You know, I try to have hope.

[1503] And it wasn't obvious.

[1504] I mean, you know, you and I talked about the experience that I had a long time ago with a particular police officer.

[1505] Wasn't my first one and wasn't my last one, but you know, in my view, I wasn't supposed to be here after that and I'm here.

[1506] So it's all gravy.

[1507] So you might as well go ahead and grab life as you can because of that.

[1508] That's sort of how I see it.

[1509] While recognizing, again, the delusion matters, right?

[1510] Allow yourself to be deluded.

[1511] Allow yourself to believe that it's all going to work out.

[1512] Just don't be so deluded that you miss the obvious.

[1513] and you're going to be fine. It's going to be there. It's going to work out. What do you think? I like to say, choose your parents wisely, because that has a big impact on your life. Yeah. I mean, you know, there's a whole lot of things that you don't get to pick, and whether you get to have, you know, one kind of life or a different kind of life can depend a lot on things out of your control. But I really do believe in the passion and excitement thing. I was talking to my mom on the phone the other day, and essentially what came out is that computer science is really popular right now.

[1514] And I get to be a professor teaching something that's very attractive to people.

[1515] And she was like trying to give me some appreciation for how foresightful I was for choosing this line of work, as if somehow I knew that this is what was going to happen in 2020.

[1516] But that's not how it went for me at all.

[1517] Like, I studied computer science because I was just interested.

[1518] It was just so interesting to me. I didn't think it would be particularly lucrative.

[1519] And I've done everything I can to keep it as unlucrative as possible.

[1520] Some of my, you know, some of my friends and colleagues have not done that.

[1521] And I pride myself on my ability to remain unrich.

[1522] But I do believe that, like, I'm glad.

[1523] I mean, I'm glad that it worked out from here.

[1524] It could have been like, oh, what I was really fascinated by is this particular kind of engraving that nobody cares about.

[1525] So I got lucky, and the thing that I cared about happened to be a thing that other people eventually cared about.

[1526] But I don't think I would have had a fun time choosing anything else.

[1527] Like, this was the thing that kept me interested and engaged.

[1528] Well, one thing that people tell me, especially early undergraduates,

[1529] and the internet is part of the problem here, is they say they're passionate about so many things.

[1530] How do I choose a thing?

[1531] Which is a harder thing for me to know what to do with.

[1532] Is there any...

[1533] I mean, don't you know what you...

[1534] I mean, you know, look, a long time ago, I walked down a hallway and I took a left turn.

[1535] Yeah.

[1536] I could have taken a right turn.

[1537] And my world could be better or it could be worse.

[1538] I have no idea.

[1539] I have no way of knowing.

[1540] Is there anything about this particular hallway that's relevant?

[1541] Or just choices in general?

[1542] Yeah, you were on the left.

[1543] It sounds like you regret not taking the right turn.

[1544] Oh, no, not at all.

[1545] You brought it up.

[1546] Well, because it was a turn.

[1547] It was a turn there.

[1548] On the left was Michael Littman's office, right?

[1549] I mean, these sorts of things happen, right?

[1550] Yes.

[1551] But here's the thing.

[1552] On the right, by the way, there was just a blank wall.

[1553] It wasn't a huge choice.

[1554] It would have really hurt.

[1555] He tried first.

[1556] No, but it's true, right?

[1557] That, you know, I think about Ron Brockman, right?

[1558] I went, I took a trip.

[1559] I wasn't supposed to take it, and I ended up talking to Ron about this, and I ended up going down this entire path that allowed me to, I think, get tenure.

[1560] But by the way, I decided to say yes to something that didn't make any sense, and I went down this educational path.

[1561] But it would have been, you know, who knows, right?

[1562] Maybe if I hadn't done that, I would be a billionaire right now.

[1563] I'd be Elon Musk.

[1564] My life could be so much better.

[1565] My life could also be so much worse.

[1566] You know, you just got to feel that sometimes you have decisions you're going to make.

[1567] You cannot know what it's going to do.

[1568] You should think about it, right?

[1569] Some things are clearly smarter than other things.

[1570] You've got to play the odds a little bit.

[1571] But in the end, if you've got multiple choices, there are lots of things you think you might love.

[1572] Go with the thing that you actually love, the thing that jumps out at you and sort of pursue it for a little while.

[1573] The worst thing that will happen is you took a left turn instead of a right turn and you ended up merely happy.

[1574] Beautiful.

[1575] So accepting, so taking the step of just accepting that, not questioning it.

[1576] I like to think that life is long and there's time to actually pursue it.

[1577] Every once in a while, you have to put on a leather suit and make a Thriller video. Every once in a while. If I ever get a chance again, I'm doing it. Yeah. I was told that you actually danced, but that part was edited out. I don't dance. There was a thing where we did do the, yeah, the zombie thing. Yeah, we did do the zombie. Man, that wasn't edited out, it just wasn't put into the final thing. I'm quite happy.

[1578] But there was a reason for that, too, right?

[1579] Like, I wasn't wearing something right?

[1580] There was a reason for that.

[1581] I can't remember what it was.

[1582] No, leather suit.

[1583] Is that what it was?

[1584] I can't remember.

[1585] Anyway, the right thing happened.

[1586] Exactly.

[1587] You took the left turn and it ended up being the right thing.

[1588] So a lot of people that ask me are a little bit tangential to the programming, the computing world, and they're interested in learning programming, coming from all kinds of disciplines outside the particular discipline of computer science.

[1589] what advice do you have for people that want to learn how to program or want to either taste this little skill set or discipline or try to see if it can be used somehow in their own life?

[1590] What stage of life are they in?

[1591] It feels like, well, one of the magic things about the Internet and the people that write me is, I don't know.

[1592] Because my answer is different for different people. My daughter is taking AP Computer Science right now.

[1593] Hi, Johnny.

[1594] She's amazing and doing amazing things, and my son's beginning to get interested, and I'll be really curious where he takes it.

[1595] I think his mind actually works very well for this sort of thing, and she's doing great.

[1596] But one of the things I have to tell her all the time, she points, well, I want to make a rhythm game.

[1597] So I want to go for two weeks and then build a rhythm game, show me how to build a rhythm game.

[1598] Start small.

[1599] Learn the building blocks and how we take the time.

[1600] Have patience.

[1601] Eventually, you'll build a rhythm game.

[1602] I was in grad school, when I suddenly woke up one day over the Royal East.

[1603] And I thought, wait a minute, I'm a computer scientist.

[1604] I should be able to write Pac-Man in an afternoon.

[1605] And I did, not with great graphics.

[1606] It was actually a very cool game.

[1607] I had to figure out how the ghosts moved and everything, and I did it in an afternoon in Pascal, on an old Apple IIGS.

[1608] But if I had started out trying to build Pac-Man, I think it probably would have ended very poorly for me. Luckily, back then, there weren't these magical devices we call phones and software everywhere to give me this illusion that I could create something.

[1609] by myself from the basics inside of a weekend like that.

[1610] I mean, that was a culmination of years and years and years, right before I decided, oh, I should be able to write this, and I could.

[1611] So, you know, my advice, if you're early on is, you know, you've got the Internet.

[1612] There are lots of people there to give you the information.

[1613] Find someone who cares about this.

[1614] Remember, they've been doing it for a very long time.

[1615] Take it slow, learn the little pieces, get excited about it, and then keep the big project you want to build in mind.

[1616] You'll get there soon enough because, as a wise man once said, life is long.

[1617] Sometimes it doesn't seem that long, but it is long, and you'll have enough time to build it all out.

[1618] All the information is out there, but start small, you know, generative and object numbers.

[1619] That's not exciting, but it'll get you a programming language.

[1620] Well, there's only one programming language.

[1621] But if you have to pick a programming language, I guess in today's, what would I do?

[1622] I guess I do.

[1623] Python is basically this, but with better syntax.

[1624] Blasphemy.

[1625] Yeah, with C syntax.

[1626] How about that?

[1627] So you're going to argue that C syntax is better than anything?

[1628] Anyway, also, I'm going to answer Python despite what he said.

[1629] Tell me, tell your story about somebody's dissertation that had a LISP program in it.

[1630] It was so funny.

[1631] This is Dave's, Dave's dissertation, it was like Dave McAllester, who was a professor at MIT for a while.

[1632] And then he came to our group at Bell Labs.

[1633] Now he's at the Toyota Technological Institute at Chicago.

[1634] A brilliant guy.

[1635] Such an interesting guy.

[1636] Anyway, his thesis, it was a theorem prover, and he decided to have, as an appendix, his actual code, which, of course, was all written in Lisp, because, of course, it was.

[1637] And, like, the last 20 pages are just right parentheses.

[1638] It's just wonderful.

[1639] That's programming right there.

[1640] Pages up on pages of right parentheses.

[1641] Anyway, Lisp is the only real language, but I understand that that's not necessarily the place where you start.

[1642] Python is just fine.

[1643] Python is good.

[1644] If you're of a certain age, if you're really young and trying to figure it out, graphical languages that let you kind of see how the thing works, and that's fine too.

[1645] They're all fine.

[1646] It almost doesn't matter.

[1647] But there are people who spend a lot of time thinking about how to build languages that get people in.

[1648] The question is, are you trying to get in and figure out what it is, or do you already know what you want?

[1649] And that's why I asked you what stage of life people are in, because if you're different stages of life, you would attack it differently.

[1650] The answer to that question of which language keeps changing.

[1651] I mean, there's some value to exploring.

[1652] A lot of people write to me about Julia. There are these more modern languages that keep being invented, Rust and Kotlin. There's stuff for people who love functional languages like Lisp, that apparently there's echoes of that, but much better, in the modern languages. And it's worthwhile, especially when you're learning languages, it feels like it's okay to try one that's not, like, the popular one.

[1653] Oh, yeah. And I think you get that way of thinking almost no matter what language.

[1654] And if you push far enough, like it can be assembly language, but you need to push pretty far before you start to hit the really deep concepts that you would get sooner in other languages.

[1655] But like, I don't know, computation is kind of computation, is kind of Turing equivalent, is kind of computation.

[1656] And so it matters how you express things, but you have to build out that mental structure in your mind.

[1657] And I don't think it super matters which language.

[1658] I mean, it matters a little because some things are just at the wrong level of abstraction.

[1659] I think assembly is at the wrong level of abstraction for someone coming in new.

[1660] I think that if you start...

[1661] For someone coming in new.

[1662] Yes.

[1663] Or frameworks. Big frameworks are quite a bit much.

[1664] You know, you've got to get to the point where I want to learn a new language.

[1665] I just pick up a reference book and I think of a project and I go through it in a weekend.

[1666] You've got to get there.

[1667] You're right, though.

[1668] The languages that are designed for that are...

[1669] It almost doesn't matter.

[1670] Pick the ones that people have built tutorials and infrastructure around.

[1671] to help you get kind of, kind of ease into it.

[1672] Because it's hard, I mean, I did this little experiment with.

[1673] I was teaching intro to CS in the summer as a favor.

[1674] Which is, anyway, I was teaching intro to CS as a favor.

[1675] And it was very funny because I'd go in every single time and I would think to myself, how am I possibly gonna fill up an hour and a half talking about for loops, right?

[1676] And there wasn't enough time.

[1677] It took me a while to realize this, right?

[1678] There were only three things, right?

[1679] There's reading from a variable, writing to a variable, and conditional branching.

[1680] Everything else is syntactic sugar, right?

[1681] The syntactic sugar matters, but that's it.

[1682] And when I say that's it, I don't mean it's simple.

[1683] I mean, it's hard.

[1684] Like, conditional branching, loops, variables.

[1685] Those are really hard concepts.
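As a side note, the "three things" claim can be sketched concretely. Here is a minimal Python illustration (the variable names are purely illustrative) of a for loop desugared into nothing but variable reads, variable writes, and a conditional branch:

```python
# The familiar, sugared form:
total = 0
for i in range(5):
    total += i

# The same loop expressed with only the three primitives:
# read a variable, write a variable, conditionally branch.
total2 = 0
i = 0
while True:
    if i >= 5:            # conditional branch
        break
    total2 = total2 + i   # read total2 and i, write total2
    i = i + 1             # read i, write i

print(total, total2)      # prints: 10 10
```

Both versions compute the same sum; the for loop just hides the counter bookkeeping behind syntactic sugar.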

[1686] So you shouldn't be discouraged by this.

[1687] Here's a simple experiment to run.

[1688] I'm going to ask you a question now.

[1689] You ready?

[1690] X equals three.

[1691] Okay.

[1692] Y equals four.

[1693] Okay.

[1694] What is X?

[1695] Three.

[1696] What is...

[1697] Y?

[1698] Four.

[1699] Y equals X.

[1700] Oh, it's easy, too.

[1701] Y equals X. Y equals X?

[1702] What is Y?

[1703] Three.

[1704] That's right.

[1705] X equals seven.

[1706] What is Y?

[1707] That's one of the trickiest things for programmers to get, that there's a memory, and the variables are pointing to a particular thing in memory.

[1708] And sometimes the languages hide that from you, and they bring it closer to the way you think mathematics works.
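The quiz above, written out in Python as a rough sketch: for plain values, assignment copies, so the later change to x doesn't touch y; but two names can also point at the same mutable object in memory, which is exactly the distinction that trips people up.

```python
# The exercise from the conversation:
x = 3
y = 4
y = x        # y now holds 3; it is not tied to x afterward
x = 7
print(y)     # prints: 3

# With a mutable object, two names can refer to the same
# spot in memory, so a change shows through both names:
a = [3]
b = a        # b and a name the same list object
a[0] = 7
print(b[0])  # prints: 7
```

The spreadsheet intuition mentioned below corresponds to the second behavior: a cell reference stays live, whereas plain assignment of a value does not.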

[1709] Right.

[1710] So, in fact, Mark Guzdial, who worries about

these sorts of things, or used to worry about these sorts of things anyway, had this kind of belief that actually people, when they see these statements, X equals something, Y equals something, Y equals X, that you have now made a mathematical statement that Y and X are the same.

[1712] Which you can, if you just put like an ampersand in front of it.

[1713] Yes, but people, that's not what you're doing, right?

[1714] I thought, and I kind of asked the question, and I think I had some evidence for this, it's hardly a study, is that most of the people who didn't know the answer, or weren't sure about the answer, they had used spreadsheets.

[1715] Ah, interesting.

[1716] And so it's a name, it's, you know, it's by reference or by name, really, right?

[1717] And so, depending upon what you think, they are, you get completely different answers.

[1718] The fact that I could go, or one could go, two -thirds of the way through a semester, and people still hadn't figured out in their heads, when you say Y equals X, what that meant, tells you it's actually hard.

[1719] Because all those answers are possible.

[1720] And in fact, when you said, oh, if you just put an ampersand in front of it, I mean, that doesn't make any sense for an intro class.

[1721] And of course, a lot of languages don't even give you the ability to think about it in terms of ampersands.

[1722] Do we want to have a 45-minute discussion about the difference between eq and equal in Lisp?

[1723] I know you do.

[1724] I will.

[1725] But, you know, you could do that.

[1726] This is actually really hard stuff.

[1727] So you shouldn't be, it's not too hard.

[1728] We all do it.

[1729] But you shouldn't be discouraged.

[1730] It's why you should start small so that you can figure out these things.

[1731] You have the right model in your head so when you write the language, you can execute it and build the machine that you want to build, right?

[1732] Yeah, the funny thing about programming on those very basic things is the very basics are not often made explicit, which is actually what drives everybody away from basically any discipline, but programming is just another one.

[1733] Like, even a simpler version of the equal sign that I kind of forget is in mathematics equals is not assignment.

[1734] Yeah.

[1735] Like, I think basically every single programming language, with just a handful of exceptions, equals is assignment, and you have some other operator for equality. Yeah. And, you know, even that, like, everyone kind of knows it once you start doing it, but you need to say that explicitly, or you just realize it yourself. Otherwise you might be stuck for, you said, like, half a semester; you could be stuck for quite a long time. And I think also part of programming is being okay in that state of confusion for a while. It's to the debugging point. It's like, I just wrote two lines of code, why doesn't this work? And staring at that for, like, hours and trying to figure it out. And then every once in a while, you just have to restart your computer and everything works again. And then you just kind of stare into the void with a tear slowly rolling down your eye.
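For the curious, Python draws the same lines being discussed here: = is assignment, == is value equality, and is is identity, which is roughly analogous to the Lisp eq-versus-equal distinction. A small sketch:

```python
p = [1, 2, 3]
q = [1, 2, 3]
r = p          # assignment: r is just another name for p's list

print(p == q)  # True: equal values (roughly Lisp's equal)
print(p is q)  # False: two distinct objects (roughly Lisp's eq)
print(p is r)  # True: one object, two names
```

So even a language that "hides" memory from you still ends up needing both notions of sameness.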

[1736] By the way, the fact that they didn't get this actually had no impact on, I mean, they were still able to do their assignments.

[1737] Right.

[1738] Because it turns out their misunderstanding wasn't being revealed to them.

[1739] Yes.

[1740] By the problem sets we were giving.

[1741] It's profound, actually, yeah.

[1742] I wrote a program a long time ago, actually, for my master's thesis, and in C++, I think, or C, I guess it was C. And it was all memory management and terrible.

[1743] And it wouldn't work for a while.

[1744] And it was some kind of, it was clear to me that it was overriding memory.

[1745] And I just couldn't, I was like, look, I got a paper deadline for this.

[1746] So I basically declared a variable at the front of the main that was like 400K, just an array.

[1747] And it worked.

[1748] Because wherever I was scribbling over memory, it would scribble into that space and it didn't matter.

[1749] And so I never figured out what the bug was.

[1750] But I did create something to sort of deal with it.

[1751] To work around it.

[1752] And it, you know, that's crazy.

[1753] That's crazy.

[1754] It was okay because that's what I wanted.

[1755] But I knew enough about memory management to go, you know, I'm just going to create an empty array here and hope that that deals with the scribbling memory problem.

[1756] And it did.

[1757] That takes a long time to figure out.

[1758] And by the way, the language you first learned probably does garbage collection anyway, so you're not even going to come up across.

[1759] You're not going to come across that problem.

[1760] So we talked about the Minsky idea of hating everything you do and hating yourself.

[1761] So let's end on a question that's going to make both of you very uncomfortable.

[1762] Okay.

[1763] Which is, what is your, Charles, what's your favorite thing that you're grateful for about Michael?

[1764] And Michael, what is your favorite thing that you're grateful for about Charles?

[1765] Well, that answer is actually quite easy.

[1766] His friendship.

[1767] He stole the easy answer.

[1768] I did.

[1769] Yeah, I can tell you what I hate about Charles.

[1770] He steals my good answers.

[1771] The thing I like most about Charles is he sees the world in a similar enough but different way that it's sort of like having another life.

[1772] It's sort of like I get to experience things that I wouldn't otherwise get to experience because I would not naturally gravitate to them that way.

[1773] And so he just shows me a whole other world.

[1774] It's awesome.

[1775] Yeah.

[1776] The inner product is not zero for sure.

[1777] It's not quite one.

[1778] 0.7, maybe.

[1779] Just enough that you can learn.

[1780] Just enough that you can learn. That's the definition of friendship: the inner product is 0.7. Yeah, I think so. That's the answer to life, really. Charles sometimes believes in me when I have not believed in me. He also sometimes works as an outward confidence. He has so much, so much confidence and self, I don't know, comfortableness, okay, let's go with that, that I feel a little bit better. If he thinks I'm okay, then maybe I'm not as bad as I think I am.

[1781] At the end of the day, luck favors the Charles.

[1782] It's a huge honor to talk with you.

[1783] Thank you so much for taking this time, wasting your time with me. It was an awesome conversation.

[1784] You guys are an inspiration to a huge number of people and to me, so really enjoyed this.

[1785] Thanks to Tom.

[1786] I enjoyed this as well.

[1787] Thank you so much.

[1788] And by the way, if luck favors the Charles, then it's certainly the case that I've been very lucky to know you.

[1789] I'm going to edit that part out.

[1790] Thanks for listening to this conversation with Charles Isbell and Michael Littman, and thank you to our sponsors: Athletic Greens, the super nutritional drink; Eight Sleep, the self-cooling mattress; Masterclass, online courses from some of the most amazing humans in history; and Cash App, the app I use to send money to friends.

[1791] Please check out the sponsors in the description To get a discount and to support this podcast.

[1792] If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcast, follow on Spotify, support it on Patreon, or connect with me on Twitter at Lex Fridman.

[1793] And now, let me leave you with some words from Desmond Tutu.

[1794] Don't raise your voice.

[1795] Improve your argument.

[1796] Thank you for listening and hope to see you next time.