The Diary Of A CEO with Steven Bartlett XX
[0] Learn from your swipes on Instagram.
[1] Your brain does what you tell it to do.
[2] You're the boss.
[3] Tell it.
[4] AI is going to be a billion times smarter than humans.
[5] I would take data points and measurements on topics like happiness.
[6] Gratitude is the ultimate solution to the happiness equation.
[7] Mo Gawdat, he is an expert on the topic of happiness.
[8] If everyone in the world listened to this podcast episode, the world would be a drastically better place.
[9] I was chief business officer of Google X. Through that network, I've connected with
[10] the wisest people on the planet.
[11] We have an app coming out at Christmas that is aiming to get to the point where we know exactly why you're unhappy.
[12] Literally the simplest surgical operation known to humankind, but five mistakes happened and four hours later, Ali was gone.
[13] There's nothing I can do to bring him back, but I can make his essence alive.
[14] My intention shifted from spending the rest of my life in grief to actually writing what he taught me so that I can share it with the world.
[15] Mo Gawdat.
[16] You know, I've done this podcast for the last 12 months every week.
[17] And there's one name which my guests, the people that sit in front of me, the successful athletes, entrepreneurs, business people from all walks of life and just generally ambitious, successful people kept saying, and it was Mo's name.
[18] You know, I hype up these episodes a lot, but I've never said this.
[19] This was my favorite podcast of all time because of the lasting value that I know it will have on my life.
[20] I think I cried twice in this podcast episode.
[21] Who is Mo?
[22] He's a genius business person, so smart in fact that Google made him the head of Google X, which was their special projects division where they do the most crazy, insane things from flying cars to machine learning, anything a genius would be capable of doing.
[23] He's also a remarkable entrepreneur, but the thing that will bring the most value to you in this episode if you listen to it will be what he says about happiness.
[24] And some of the things he says today have just created these like personal revelations in my head where I genuinely feel that I have to go and sit down in a room alone and think about them for the next couple of weeks, genuinely life-changing.
[25] And you've never heard me this enthusiastic on the podcast.
[26] So if you're ever going to trust me with an episode, trust me on this one.
[27] Are you ready?
[28] I hope you are.
[29] Without further ado, I'm Steven Bartlett, and this is The Diary of a CEO.
[30] I hope nobody's listening.
[31] But if you are, then please keep this to yourself.
[32] I guess my first question for you is, because, you know, when I look at the things that you write about, the topics you speak about so often, the businesses you've built, the areas of interest you have, and I see that they're so diverse, and also they're very smart, shall I say.
[33] Oh, thank you.
[34] My question is, what were those early personal but also early professional experiences that have shaped the way you see and analyze the world?
[35] What is that context that we need to know about you?
[36] I think the thing that maybe shapes me most is that I was born in the East, raised in the East with the culture of the East and educated in the West and worked in the West with the culture of the West.
[37] And in a very unusual way, I didn't judge either.
[38] I think there is so much value to learn in each of them, but they're almost exact opposites.
[39] And to be able to embrace both of them maybe has allowed me to translate concepts that are normally spoken about on one more than the other to the other.
[40] So most of my work really is highly dependent on my early math, you know, love of mathematics, love of physics.
[41] I'm a very serious geek.
[42] I don't say that in public because it affects my, I just said it.
[43] It affects my CEO job, but I'm really geeky, like, to the point that I was writing code until a few years ago.
[44] But I take all of that language of being very organized, very systemic, almost, you know, engineered in everything.
[45] And I try to explain concepts like spirituality, like love, like
[46] humanity's position in the modern age, and so on and so forth.
[47] And I explain them in slightly unusual ways.
[48] You know, I use, for example, quantum physics and theory of relativity to try and explain death.
[49] I use mathematics and theory of probability to discuss the question of the presence of a divine being and so on and so forth.
[50] And I think the thing is, I have a brain defect somewhere that basically does not stop me from addressing crazy ideas.
[51] So I normally am writing six books at the same time, and I love it. I love it. I don't write for you, sadly. I hate to say this: I write for me. So I get inspired by a topic, and then I build a structure, literally like we write subroutines in software. I write the entire, you know, flowchart, if you want, of the book, and then leave it on my desktop, and then start to work on it, sometimes for a year, sometimes for two years. And eventually something comes out that informs me and enriches me and, at the same time, you know, interests people.
[52] You write for you.
[53] Oh, absolutely.
[54] So why did you write a book about happiness?
[55] That's the most selfish thing I've ever done in my life.
[56] I mean, if you know my story was, so I was very successful at a very young age.
[57] So unusual.
[58] I mean, I'm born and raised in Egypt, educated in public school, public university in Egypt.
[59] So my biggest dream was I was going to become sales manager in IBM, Egypt.
[60] That was my biggest dream.
[61] And look at what happened.
[62] I mean, I went through beyond my wildest expectations.
[63] Chief Business Officer of Google X is literally the second best job on the planet.
[64] Okay.
[65] And, and, you know, I, I had all of this, you know, that people dream of at a very young age, at 29.
[66] I had, you know, the big villa with the swimming pool, you know, all of the money, all of the suits, all of the luxury cars.
[67] From 25 when I had nothing to 29 when I had everything, I had the most wonderful woman in my life, beautiful, wise, sensitive, loving, who gave me two wonderful kids, and I was clinically depressed.
[68] And it's not an unusual story, where we keep chasing all of those things.
[69] I mean, my luck was that I hit my middle age crisis at 29 when I achieved everything they told me I was supposed to achieve and couldn't find happiness.
[70] And so I ended up in a place where I started to research the topic, just like I would anything else, and I couldn't get a word of it.
[71] I just couldn't get it.
[72] You know, they told me to meditate.
[73] My engineering mind was like, tell me why, explain something to me. Tell me why it works, right?
[74] You know, if they told me to say om, I would get really angry.
[75] I still don't say om, right?
[76] But the idea is, is my brain wouldn't get it.
[77] And instead of me rejecting that, I started to look at those topics as an engineer.
[78] So I would start to do literally, you know, like the scientific method, I would take data points and measurements and try to do fitting lines and curves and charts on topics like happiness.
[79] And, you know, it started to work for me. So four years in, I started to really become a little better.
[80] And I would go back to my wonderful son, Ali, who was born a tiny little Zen monk.
[81] He knew those things instinctively.
[82] And even as a young child, you know, age eight, I think was when I started to discuss those things with him, he would listen there and ask me a couple of questions clearly to entertain me and then basically say, well done, papa, this is amazing.
[83] You could have just asked me, okay?
[84] And then he would literally explain it to me from the heart.
[85] So how the heart feels it.
[86] I would get how the left brain sees it and he would get how the heart feels it.
[87] Your son, at eight.
[88] He was so wise, Steven.
[89] He was so wise.
[90] Ali, when he was 16, I promise you, my friends will tell you, I would actually, I publicly announced when I grow older, I wanna be like Ali.
[91] He was a very unusual being.
[92] And he spoke very little, very, very little.
[93] He was either laughing all the time and being silly and goofy, or when you asked him a serious question, he would stay silent and then speak eight words, okay?
[94] And those eight words would literally reshape your world, okay?
[95] And I noticed that at a very young age for him.
[96] And so I started to consult with him on a lot of topics, on a lot of topics.
[97] And on happiness specifically, together we ended up with a model that worked.
[98] You know, we had the happiness equation.
[99] We had the happiness model.
[100] And it worked.
[101] And it worked so well that when we lost him, sadly, when he was 21, my intention shifted from spending the rest of my life in grief to actually writing what he taught me so that I can share it with
[102] the world. And that basically determined the next life for me, after the life of the executive and the chief business officer and stock options and luxury and the cars. That second life really was the result of his departure. And during that period, the inspiration that inspired you to write the book at the very beginning and go on that journey to really find the answer to happiness, you said you were clinically depressed. Yes. Now, for people that don't know what that means practically, can you give a description? Nothing would make me happy. And you can literally, you know, as I interviewed Ruby Wax on my podcast, on Slo Mo, and Ruby was known for sometimes, you know, depression and sometimes teaching, and she would describe it as: they cut your head off and fill you with concrete. When you're depressed, you're unable to do anything. You're unable to enjoy anything. You're unable to engage, right? And it comes in different layers. But for me, the challenge was I was so successful.
[103] I was so successful.
[104] I literally could print money on demand.
[105] I mean, there were times when my wife would say, can we change the car?
[106] And I would say, so what would you like?
[107] And she would say, okay, wait until Wednesday.
[108] And I would be on the stock market, you know, trading for a few days and making money.
[109] Right.
[110] It was so crazy: because of my math skills, before the age when machine trading was really as entrenched as it is today,
[111] I could make money on demand.
[112] And yet I poured that money on my life and I couldn't find happiness.
[113] And that really shakes you because now you can get the vacations they're talking about, you can wear what they're, they told you was gonna make you happy, you can buy the things that they told you are gonna make you happy, but nothing's making you happy.
[114] And then it started to reflect on my family.
[115] And I remember vividly the turning point was a Saturday morning when, you know, my son, Ali, was that little Zen monk.
[116] My daughter is life itself.
[117] She truly is pure joy.
[118] Okay.
[119] And she was, you know, it was a Saturday morning.
[120] She's jumping up and down in joy saying, oh, mommy said we're going to go there.
[121] Can we stop and get, you know, ice cream on the way?
[122] Can we do that?
[123] And she's so happy.
[124] And I was doing whatever busy people do,
[125] reading an email or whatever crap.
[126] And basically I looked at her slowly raising my head in grumpiness and said, can we please be serious for a minute?
[127] Okay.
[128] What's serious?
[129] She was five.
[130] And I could see with my own eyes as my beautiful daughter's heart broke.
[131] Okay.
[132] And I think to me that was basically the moment where I said, I can't live with this person anymore.
[133] I can't live with me. And when you see that, you make that choice.
[134] And sadly, most people who are successful, like your own audiences, actually wait until that moment happens.
[135] When they're old, when they've gotten to the point where the good days have passed, the days where you could have actually built that connection with your family or, you know, enjoyed your life a little more, are behind you.
[136] And then they wake up.
[137] I was so lucky that I woke up when I was 29.
[138] And so you have this idea, as you say there, to write this book and to answer this question that has become so relevant
[139] and important to answer in your life.
[140] And as you're on that journey and consulting with Ali, Ali passes.
[141] Yes.
[142] No, so Ali, Ali left after he trained me well enough.
[143] So it's really interesting.
[144] I mean, he, so I started my research maybe when he was six or, yeah, he was six, seven.
[145] And finished when he was 18, 19.
[146] And you couldn't dent my happiness then.
[147] I was the example of happiness.
[148] I mean, I'm Middle Eastern, and at the time, at least through part of that journey, I used to work at Microsoft, and Microsoft's office was in Seattle.
[149] So I would fly every month for a week to Seattle from Dubai to JFK and then from JFK to Washington.
[150] And every time I landed in JFK, I got that random security check where they give me a red envelope and take me to Homeland Security.
[151] It was really not the kindest of treatment, if you want, but I'd go through it with a stupid smile on my face.
[152] Like I flew 12 hours, then I stood in line for an hour and a bit, and then they gave me that envelope, and there is a guard walking next to me, now assuming I'm a criminal until proven otherwise, and they sit me in that room, and I have that stupid smile on my face.
[153] Nothing could dent my happiness, okay?
[154] I have beautiful thoughts inside me, have compassion for every one of those officers, you know, that they're just doing their job and of course they're worried about their country.
[155] And it's really weird, to the point that I did this 37 times in a row, okay?
[156] And to the point that I would walk into the Homeland Security Office and the officers behind the counter would go like Mr. Gates is back.
[157] Okay, they know I'm the guy from Microsoft.
[158] I've been there last month.
[159] I would walk to the counter and they'd say, answer the same 10 questions we asked you last time.
[160] So I would say, this is my name, this is my mother's name, right?
[161] And go through them one by one,
[162] without a dent in my happiness.
[163] But then life tests you.
[164] So, and by the way, I mean, we can talk about this, but of course you can feel unhappy, but I found a way to always come back to happiness if you want.
[165] And then life tests you, and I think life nudges you.
[166] It seems that Solve for Happy needed to be written, okay?
[167] And I had the notes for it in 2011, but hey, chief business officer of Google X, busy, busy, busy, busy.
[168] And I kept delaying it and delaying it and delaying it until Ali basically came to visit us in Dubai, 2014.
[169] And he was diagnosed with a very simple appendix inflammation.
[170] And, yeah, you know, it's literally the simplest surgical operation known to humankind.
[171] It's literally a four, five minute thing.
[172] But five mistakes happened, five in a row.
[173] Every one of them was preventable, every one of them was fixable, but five in a row went wrong, and four hours later, Ali was gone. Yeah. I mean, it's easy to understand how it feels. Even today, seven years later, losing a child is just the hardest thing ever. At least for me, it's the hardest thing ever. Like, if life had taken all my money and all that I've achieved and, you know, made me homeless, it would have probably felt less painful than losing him. But our reaction was very, very unusual.
[174] Instead of trying to, you know, fight with life, I simply said, okay, you know, he's gone.
[175] There's nothing I can do to bring him back, but I can make his essence alive.
[176] I can keep his essence in this world.
[177] And his essence to me was what he taught me. He saved my life with what he taught me about happiness.
[178] And so I sat down to write, the first time that I really wrote in English, which is not my first language, and I wrote for four and a half months straight.
[179] And, you know, if you've read The Alchemist by Paulo Coelho, you know, when you know your life's purpose, the universe conspires to make it happen, I just can't tell you what happened since then.
[180] I mean, every part of the universe is just pushing for this mission to work, you know, from finding my agent at a time where he was actually not feeling great about his life.
[181] So he kept saying, can you send me another chapter?
[182] Until I sent the whole book, thinking, what's he going to do with this?
[183] And then literally we meet and he basically says, can I please represent you?
[184] We go out, meet 17 publishers within a week in New York, the capital of publishing.
[185] And then things roll and roll and roll to the point that I came here to the UK literally a week after the publication of Solve for Happy.
[186] And I had that very famous interview with Channel 4 News, which, within three days, was the highest-watched news clip in the history of Channel 4, to the point that the CEO started to wonder, like, I've been broadcasting violence and war for, you know, I don't know how many years.
[187] And my highest-watched clip, at the time it was 37 million views, is about happiness, okay?
[188] Which obviously is understandable.
[189] It is the pandemic of our time.
[190] Within three days after that, we were watched 87 million times, okay, more than double the highest.
[191] And the movement was starting, the one billion happy movement was starting.
[192] Basically, I think it was a very strong confirmation to the world that this is something the world needs.
[193] And you can actually feel today that there is a shift, not because of me only, but because there are so many people coming into this.
[194] There is a big shift now from employee satisfaction to employee happiness, from, you know, let's just work on mental health, to actually, let's work on happiness, and so on and so forth.
[195] It seems that the world is getting it, that, you know, we're not supposed to be grinding ourselves and giving away our lives for things that we think are going to make us happy.
[196] We might as well be happy and get everything as a result because we can then be successful, we can be engaged, we can be lovable, we can be, you know, supported and so on and so forth.
[197] It's just remarkable in my mind that you can lose the most important thing to you, as you've described it, in your life, your son, to human error and still not fall into resentment or bitterness or regret.
[198] What would it do?
[199] What would it do?
[200] I mean, of course.
[201] I mean, I took steps to make sure that things are corrected so that no one else gets hurt, right?
[202] But what would it do?
[203] I mean, I was very prominent at the time.
[204] When Ali died, I was chief business officer of Google X, but I was still between Dubai and California.
[205] So I spent half of my time in Dubai.
[206] And it was after seven years of being vice president of emerging markets for Google.
[207] So I had opened half of Google's offices globally.
[208] I was very, you know, well connected to the business leaders and government leaders in Dubai.
[209] And so when Ali died, we got a call from the top of the Ministry of Health saying we heard what happened.
[210] I'm so sorry, Mo. Would you mind if we perform an autopsy on Ali's body to get to the bottom of this?
[211] So I looked at his mother sitting next to me, most wonderful woman on the planet with her eyes teary.
[212] And I said, Nibel, would you mind if they do that?
[213] And she raised her head and said, would it bring Ali back?
[214] And that one sentence anchored us in the truth.
[215] You see, the problem with grief is that the cycle of grief takes you through five steps.
[216] The very last step is acceptance.
[217] Okay.
[218] And that step of acceptance could take you 70 years sometimes.
[219] For us, the truth was glaringly obvious four hours later.
[220] There's nothing you can do to bring him back.
[221] This is it.
[222] And the finality of death is so corrective of all of our human illusions.
[223] This is it.
[224] He's not coming back.
[225] So what can you do now?
[226] And my brain started to attack me. My brain started to say, you should have, you know, the one thought for the first few days was you should have driven him to another hospital.
[227] You should have driven him.
[228] Until I said to my brain, like, okay, I wish I could go back and drive him to another hospital. I can't. So can you please bring me a thought I can act upon? Okay. And so I had a couple of days of silence, and then my wonderful daughter comes to me, they were very, very close, and she said, Papa, Ali had a dream a couple of weeks ago, and he called and told me about it, and I think it's very relevant, you need to know. Okay. And I said, what, baby? And she said, he dreamt he was everywhere and part of everyone, and that it felt so amazing that he didn't want to be back in his body. And when she told me, I still tear up thinking about it today, when she told me this, my blurry brain could only listen to: this is my master giving me my target. That's the only thing I heard. It's like, make me everywhere and part of everyone. That's what I heard.
[229] And at the time I was head of Google, I understood billions.
[230] I knew how to get a message to billions of people.
[231] So what did I do?
[232] I literally said out loud, consider it done.
[233] It's done.
[234] Okay.
[235] And I told you, when I wrote Solve for Happy, it was the most selfish thing you can ever do.
[236] I wanted the essence of my son to live on.
[237] And so I basically wrote it with the intention of, okay, I'm going to make him everywhere and part of everyone.
[238] I'm just going to spread this beautiful essence to 10 million people, and then, I don't know, 70 years later, through six degrees of separation, a tiny bit of him will be everywhere and part of everyone. That was my blurry brain, but maybe it was also life's way of saying, share something useful. Enough building phones and building, you know, faster engines. You know, maybe the world needs something different. Maybe share something that actually is needed by humanity. Is that where that 10 million number came from?
[239] 10 million was the original target, yeah.
[240] 10 Million Happy, it was.
[241] And at the time, it felt crazy that we were shooting for 10 million.
[242] But again, with things like Channel 4 alone, I think by week eight, we had reached 137 million people.
[243] Crazy.
[244] Okay.
[245] But we don't measure those, by the way.
[246] We don't measure just the views.
[247] We measure how many people took action in terms of receiving the message.
[248] So it's basically, one billion happy today
[249] is three steps.
[250] Step one is we're going to send you a message that wakes you up that tells you happiness is your birthright and it's highly attainable.
[251] It follows an equation.
[252] So you can actually do certain things and you will be happier, right?
[253] The second, once you get that message, that's not enough.
[254] We count you as one of one billion happy if you take one of two actions afterwards.
[255] Either you invest in your own happiness, right?
[256] So you invest in your own happiness by going to another piece of content, shifting to, you know, reading a book or watching another video, or we can see that you're investing in your happiness by asking me a question, getting in touch, whatever that is.
[257] Or you share happiness forward.
[258] And the entire Ponzi scheme, if you want, of one billion happy is built on the idea that we hope, as a small team, within hopefully the next 10, 12 years, we will have, you know, cultivated a million champions that will be able to, you know, make a billion people happy as their own mission, and then we will get completely forgotten. Okay? Because the only way for it to succeed is that it's not counting on, you know, one person or one face or one team, because the team will get dismantled and I'm, you know, going to disappear, and it has to be a movement, right? And so the whole movement is on that pillar number three. Pillar number three is: you got a message of happiness, you know, it might have touched you. Can you share it with two people and ask them to share it with two people and ask them to share it with two people?
[259] Simple exponential curve, simple Ponzi scheme, really, a positive Ponzi scheme.
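A rough back-of-the-envelope for the doubling described here, added purely as an illustration (the round count is an assumption, not a figure from the conversation): if every person who receives the message passes it to two new people, reach roughly doubles each round, so on the order of thirty rounds of sharing already cover about a billion people.

    2^{10} = 1024 \approx 10^{3}, \qquad 2^{30} = (2^{10})^{3} \approx 10^{9}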
[260] Okay.
[261] And it's been working.
[262] We think we're at 51 million, which is not the biggest number, but you understand the law of accelerating returns, right?
[263] So if it's now, you know, if we can do this again in four years and then that becomes a hundred million and then the hundred becomes 200 and so on, you know, who knows?
[264] Who knows?
[265] We may get there.
[266] And the mission is, I guess the banner of the mission is Solve for Happy, right? No, no, no, no. Actually, not at all. So books don't go to millions. That's the truth of books, okay? The books allow me the opportunity to sit with you and spread this to your audience, okay? But the mission is much bigger now. So the pillars of the mission are: there is a tremendous amount of content that I put out there. I mean, if you search for my name on Google, you'll have hundreds of hours of videos.
[267] I'm tireless.
[268] I did, yeah.
[269] I watched several hours.
[270] Yeah, and some of them are, you know, Stanford University classrooms and some of them are short conversations with, you know, insightful people like yourself.
[271] And it's, you know, there are hundreds and hundreds and hundreds of hours of content, but that's one side.
[272] The other side, of course, is that through that network, I've, of course, connected with the wisest people on the planet.
[273] So, you know, when I was chief business officer of Google, I would be connected to prime ministers and business
[274] owners. Now I'm connected to His Holiness the Dalai Lama, the top monks in the world, the top teachers in the world.
[275] And so I brought all of them together on my podcast, Slo Mo.
[276] And Slo Mo is very unusual because it comes from a chief business officer, right?
[277] Basically a simple message to say, take a little bit of time to slow down and reflect.
[278] Okay.
[279] And it's not me talking.
[280] It's the wisest people on the planet. Like, I get blown away every time, okay? And so that's another element. The other element, of course, is training material. So we're working on that. We have an app coming out at Christmas that is actually really promising. So we're building an artificial-intelligence-based happiness assistant, which covers a very interesting gap: that we're all unhappy, but we're not all unhappy for the same reasons. And so if I just dispatch content at you that is irrelevant, okay,
[281] I'm probably going to piss you off rather than make you happy.
[282] So the app, in version one, it's not perfect yet, but in version two, we're aiming to get to the point where we know exactly why you're unhappy.
[283] And so we're able to actually show you enough learning and enough practice that can allow you to find a path back to happiness.
[284] So it's more intelligent, if you want, not in understanding happiness, but in understanding unhappiness, if you want.
[285] So to understand what happiness is you have to understand the cause of it.
[286] Yes.
[287] And you write about that extensively in Solve for Happy.
[288] So what is the cause of unhappiness as you see it?
[289] Especially if you're building sort of machine learning applications that are going to, you know, solve, you know, make people arrive at contentment or happiness in a personalized way.
[290] We must be able to know what's causing this lack of happiness.
[291] Allow me a bit of time to explain it because it's simple when we get it, but it's not simple to get to it.
[292] So happiness is very predictable.
[293] Okay.
[294] If you look back at any point in your life where you ever felt happy, there is one commonality across all of those moments that can actually be documented in a mathematical equation.
[295] You've never felt happy because of a specific event in your life.
[296] Take, for example, rain.
[297] Rain doesn't make you happy or unhappy.
[298] There is no inherent value of happiness in rain.
[299] Rain makes you happy when you want to water your plants.
[300] And it makes you unhappy when you want to sunbathe.
[301] Right.
[302] And so it's not just the event, rain.
[303] It's the comparison between the event and an expectation in your mind of how life should be.
[304] Okay.
[305] If you're worried about your plants, then your expectation is, life should be generous to me and give me rain so I can water the plants.
[306] And if life does that, then life meets your expectations and you're happy.
[307] Okay.
[308] And so happiness, in that sense, becomes an equal to or greater than.
[309] So it's really mathematics: your happiness is greater than or equal to your perception of the events of your life
[310] minus your expectations of how life should be, okay?
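Written compactly, the equation being described (the happiness equation from Solve for Happy, restated here in symbols only for readability) is:

    \text{Happiness} \;\geq\; \text{your perception of the events of your life} \;-\; \text{your expectations of how life should be}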
[311] And apply that to anything.
[312] Apply that to anything.
[313] So, you know, my favorite example is nature.
[314] We're all happy in nature.
[315] Why are we all happy in nature?
[316] I mean, you go out there and there are ants and there are flies and, you know, trees are crooked and there are, you know, shrubs everywhere and bushes.
[317] And it's just really not that hedged and organized.
[318] But that's what we expect.
[319] So, you know, nature's chaos
[320] is what we expect nature to be, and so we feel happy.
[321] You know, nobody ever sits in front of the ocean and says, I like the view, but please mute the sound.
[322] Okay?
[323] You just take it, you know, it's the monotonous sound and the view and the wind and the sun and the whole experience, right?
[324] And because of that, happiness becomes very different than what was defined to us.
[325] Okay?
[326] What was defined to us is that happiness is found in gathering at the pub or a party or an activity or some kind of pleasure or fun or elation or whatever that is.
[327] That's not at all true.
[328] I call these states of escape.
[329] Happiness as per the definition of the happiness equation is events equal to or beating expectations, life going my way.
[330] And so basically happiness is that calm and peacefulness you feel when you're okay with life as it is.
[331] It doesn't really matter what life is.
[332] Okay? What matters is that you can be okay with it, right? So you take, you know, any example. If your boss is annoying and your expectation is, yeah, bosses are annoying, this is what life is about, they become bosses because they're annoying, right? And so, if that's your expectation, you're going to look at it and go like, yeah, I need to learn the skill of managing annoying bosses, okay? And if that's the case, then you're not going to be upset about it. Similarly, anything else, if you look at it, then it's not just the event.
[333] It's your perception of the event.
[334] So you have something to influence.
[335] It's not just the event. Your partner might say something hurtful on Friday at 4 p.m. That's the event.
[336] My partner said something hurtful.
[337] On Sunday morning, you tell yourself, he or she doesn't love me anymore.
[338] That's your perception of the event.
[339] That's not actually the event.
[340] The event is something hurtful was said.
[341] but your perception of the event is your work, it's your brain adding color to it.
[342] And then you compare that to your expectations, right?
[343] You compare my boss is annoying to my boss shouldn't be annoying.
[344] Where did you get that from?
[345] Right.
[346] So we blur the happiness equation.
[347] We break the happiness equation because of what I call the six and seven.
[348] Okay.
[349] Six grand illusions and seven blind spots. The six grand illusions are basically, call them pathways that the modern world teaches us to navigate the modern world, that are illusions, that are not true.
[350] Take, for example, control.
[351] Everyone knows that to succeed in the modern world, you have to learn to control certain events, right?
[352] So you start to believe that the way to succeed in life is to control everything.
[353] But the truth is, even if you go down to the basics of physics, that we never are in control, that the absolute design of nature itself, of the universe itself is entropy and chaos, right?
[354] That's the actual design.
[355] And so if you try to control it, you're bound to be disappointed.
[356] A lot of events are going to miss your expectations.
[357] And yes, I'm not saying don't control anything at all, but start to understand that you're going to be selective because you have a finite amount of effort.
[358] And, by the way, even if you're selective and you try to control everything, sometimes things will fall out of control, okay?
[359] And that should be your expectation.
[360] Once you get that right, that was my biggest illusion, okay?
[361] I'm a mathematician, I'm a software developer, I am a physicist, I am an engineer, and I'm a senior executive.
[362] It doesn't get worse than that, okay?
[363] I'm like the worst, absolute the worst.
[364] I used to give my wonderful wife, I swear to you, Steven, don't judge me, I used to give her a spreadsheet that would tell her when to wash the colors and when to wash the whites based on our average consumption as a family to save the environment.
[365] And poor Nibel would actually smile at me and say, sure, baby, I will use this.
[366] Of course, and then ignore the hell out of me, because that's how crazy you can be when it comes to control.
[367] Now, these are the illusions.
[368] If you live your life through the illusion of control, good luck finding happiness.
[369] So six grand illusions, the illusion of thought, the illusion of self, the illusion of knowledge, the illusion of time, control, and fear.
[370] Okay?
[371] Now, that's one side.
[372] And that disrupts your entire view of what to expect from life because you're expecting life to behave through the lens of an illusion.
[373] The other side of it is what I call seven blind spots.
[374] And the seven blind spots are not really defects in your brain.
[375] As a matter of fact, they are the very design of your brain.
[376] Okay? Your brain is designed to tell you what's wrong, okay? It's not designed to, you know... if a tiger shows up right here now, my brain has no use whatsoever in telling me, oh my God, look how majestic that animal is, right? Yeah, it's a beautiful animal, but my brain will say, we're gonna die, okay? And we're gonna die is the idea that basically makes our brain constantly look for what's wrong and blur the events of life. You ask a mother, and she will say, oh, my daughter's been sick all winter.
[377] She just had two episodes of flu, three days each.
[378] But to the caring heart of a mother, that needs to be exaggerated. The exaggeration is one of the blind spots.
[379] Your brain is trying to get you to take action, so it pushes you.
[380] It pushes you by exaggerating the event a little bit so that you jump in and take action.
[381] And accordingly, the event you're comparing to, you're comparing the wrong event to the wrong expectation and the happiness equation falls apart.
[382] Under all of this, you're inferring something which I think will annoy a lot of people.
[383] And that is that happiness is a choice.
[384] Oh, totally.
[385] And that you can choose to be happy.
[386] And that if you're unhappy, really, for many circumstances in our life, day to day, in work and love and relationships, personal responsibility is the answer
[387] and it's entirely on you.
[388] And a lack thereof is the cause.
[389] Absolutely.
[390] You know what you just did?
[391] You've just lost us 8% of the audience.
[392] I know.
[393] Do you know why I know?
[394] Because I did a tweet one day about this.
[395] And what my tweet was, there's like a, I guess a mental model, but there's a reframing that I think has brought me happiness, which is when something happens to me, I used to, like many people say, X thing that happened has pissed me off.
[396] Yeah.
[397] And just by changing that sentence to, I've pissed myself off because of X thing.
[398] Absolutely.
[399] And I tweeted that.
[400] And I was like, try it.
[401] Just like reframe it and take personal responsibility for how you're feeling.
[402] And in the comment section, everyone was like, nope.
[403] Yeah.
[404] People don't like the idea that they have control over their emotional responses.
[405] So when I wrote, so when I wrote, when I write books in general, I write them, I write them like software.
[406] So I issue a beta version, okay, and I get 270 people.
[407] I don't know why 270.
[408] I heard that's fascinating.
[409] Yeah.
[410] I get 270 people
[411] to read it on Google Docs.
[412] So I give them editor privileges so they can actually edit the text, right?
[413] And then something fascinating happens.
[414] They edit the text and then others edit what they edited.
[415] Okay.
[416] And there is a conversation happening.
[417] And basically it takes the book to its best possible version if you want.
[418] In Solve for Happy, I had a sentence on page 11 that basically said exactly what you said.
[419] Happiness is a choice.
[420] Okay.
[421] And at that page, I lost 8% of the readers.
[422] And, you know, I looked into the information that they gave me about themselves, the early readers, and most of the 8% that left were already in depression.
[423] Okay.
[424] And to tell someone, it's your responsibility to get yourself out of this horrible place that you're in is quite disturbing because we like the idea of saying, no, no, hold on.
[425] No, it's not me. Life is treating me really badly.
[426] That's why I'm not happy.
[427] Okay.
[428] I can't do anything about it.
[429] Life took my son.
[430] You know, life took my son.
[431] I have the right to be unhappy.
[432] Yes, life took your son.
[433] That's true.
[434] And you have the right to be unhappy.
[435] But you're never going to get out of unhappiness if you wait for life to bring him back or you wait for life to correct its action.
[436] Okay.
[437] The only way you can come out of unhappiness is if you choose and say, okay, it's going to be a long journey.
[438] It's going to take a lot of time.
[439] Okay.
[440] And I'm going to try and try and try, but I'll get there.
[441] And neuroplasticity proves that.
[442] Neuroplasticity basically tells you that if you just run a happiness kind of activity once a day, every day your brain will be better at it.
[443] And I mean, please don't get me wrong.
[444] But what do most of us do every day?
[445] We watch negative news.
[446] We swipe on toxic positivity and we're just drowning ourselves in negativity.
[447] And then what happens?
[448] What happens is we become really good at being negative.
[449] We become really good at finding what's wrong with life, become very good at, you know, getting pissed off with the prime minister, right?
[450] Because it's an activity we do on a daily basis.
[451] So your brain goes like, this must be important for her or him.
[452] Okay, I'm just going to make sure I have their neurons aligned around that.
[453] And so you're basically, we're basically configuring our brains to be unhappy.
[454] I have not watched a horror movie for 15 years.
[455] Really?
[456] Yeah, you know what that means?
[457] I have not had a nightmare for 15 years.
[458] Not a single one.
[459] I have not watched a violent movie unless it's really strongly recommended to me because it has a good message in it.
[460] And I watch Michael McIntyre every night before I sleep.
[461] I love Michael McIntyre.
[462] Who's going to get me to say hi to Michael McIntyre?
[463] But think about that practice.
[464] My brain before I go to sleep is laughing.
[465] It's laughing.
[466] That's a choice.
[467] That's a choice.
[468] And that is the kind of neuroplasticity.
[469] that we need to shift.
[470] You know, if you go to the gym and lift weights every day, you're going to look like a triangle.
[471] If you squat every day, you're going to look like a pear.
[472] The same is happening inside your brain.
[473] You just don't see it.
[474] If you're constantly watching, you know, news media, right?
[475] You're literally building your muscles that are concerned and are, you know, critical and are worried about the world when in reality, most of the time you can't do anything about it.
[476] Like, okay, so I'll give you a very strange example.
[477] When I was locked down, first lockdown, I was in London.
[478] Second lockdown, I was in Canada.
[479] Okay?
[480] As the lockdown was approaching, I stopped watching news after April 2020.
[481] Zero news.
[482] Okay?
[483] And by the time I was in Montreal, someone texted me and said, hey, by the way, did you know we're going to Code Red tomorrow?
[484] I said, yeah, what's Code Red?
[485] She said, all restaurants are closed.
[486] You wear a mask everywhere.
[487] I said, good.
[488] That's it.
[489] That's all the news I needed to know.
[490] Really, okay?
[491] People would go like, no, how come?
[492] You need to know the numbers and the statistics and the death rate and they're, no, I don't.
[493] Okay?
[494] Someone else is doing this.
[495] And by the way, if I know it and I don't like it and I don't believe in what they're doing, I'm going to be locked down anyway.
[496] So can I waste my time or actually utilize my time in building a podcast that becomes one of the top half percent of all podcasts globally?
[497] Isn't that a better use of my life than just watching the news and creating that illusion for myself that I can actually influence anything, when in reality I can't?
[498] So, you know, I normally advise people and say, look, if you've been following a certain topic for the last two months and have not been able to influence the decision on that topic for the last two months, you're useless.
[499] So stop watching that topic, okay?
[500] And start choosing topics that you can champion, okay, one or two, because you're human, you're not, you know, you're not Superman.
[501] Find one or two real, you know, purposes that you actually care about and try to learn enough about them, enough depth about them to influence them.
[502] That's the way to make the world better.
[503] That's the way to make your life better.
[504] And yeah, climate change is really something very important, but it's not on my agenda.
[505] I don't work on climate change.
[506] I work on happiness.
[507] That's my part of life.
[508] Okay?
[509] Someone else I trust will be working on climate change, which I believe is as important,
[510] if not more important, but it's not mine.
[511] I don't need to watch everything about it, okay, and concern myself about it all the time.
[512] I need to be updated.
[513] I need to do my part by really changing my habits as a human, but that's it.
[514] That's as far as I go.
[515] There's something in there which is clearly a theme, and I think in the three topics we've touched on, the passing of your son, you know, what you talked about there with COVID, and other elements, which is this theme of, like, radical acceptance.
[516] Oh, absolutely.
[517] Like instant radical acceptance.
[518] Oh, absolutely.
[519] I mean, this is what I call the Jedi master level of happiness.
[520] So there are three levels of happiness, right?
[521] The, you know, if you really think about it, I call it the happiness flow chart.
[522] Events are going to piss you off.
[523] It's just the truth.
[524] If you can manage to acknowledge your emotion and say, oh, my God, I feel, am I angry?
[525] Is this anger?
[526] I mean, is this what I'm feeling?
[527] And then you take that feeling and you say to yourself, okay, interesting.
[528] I am angry.
[529] I need to do something about it.
[530] I will give you three steps.
[531] Okay.
[532] The beginner's level is, ask yourself if what you're thinking is true.
[533] Your partner said something hurtful on Friday.
[534] Your thought is he or she doesn't love me anymore.
[535] Okay.
[536] Ask yourself if that thought is true.
[537] If it isn't, drop it.
[538] There is no point to be unhappy.
[539] If it is, then let's go to the black belt level of happiness, which is, can I do something about it?
[540] That's the second question.
[541] Is it true? That's question one.
[542] Can I do something about it?
[543] That's question two, right?
[544] And honestly, by the way, it doesn't take more than two seconds.
[545] To feel the emotion, ask yourself if it's true and then go to say, can I do something about it?
[546] And if yes, then do it.
[547] What are you waiting for?
[548] Text him or text her and say, baby, can we please talk over dinner? What you said on Friday hurt me. Okay?
[549] Instead of just banging your head against the table, hoping that they will find out and come and say, oh, I'm so sorry.
[550] You know, I was teaching, this story really hurts me. I was teaching, you know, before lockdown, I taught a lot of people in workshops and seminars, more than 20,000 people.
[551] One day, one of them comes to me in the first break and says, what are you talking about?
[552] What do you mean happiness as a choice?
[553] You have no idea what happened to me. Okay?
[554] And I said, okay, and she said, when I was 17, she was 74 at the time.
[555] Can you believe that? 57 years of holding on to one thought, hitting her head against the wall, right? And I hugged her. I hugged her, I cried, and I said, did it work? Did all of that work? Or was the better thought, okay, it was horrible, but can I do something about it? And that's question number two. That's black belt. Sometimes, however, there's nothing you can do about it. Whatever she experienced could be irreversible.
[556] What I have experienced, the loss of Ali, is irreversible.
[557] There is nothing you can do about it.
[558] And I'm not asking everyone to get there quickly, but the Jedi master level of happiness is to say, okay, it happened, and I have no way to change it.
[559] There is nothing I can do to fix it.
[560] So can I accept it, but not surrender and lie down and die; accept it and then start to do something to make my life better despite its presence,
[561] or maybe because of its presence, okay?
[562] Can I accept that Ali died and start to spread his message so that my life and the life of others become better?
[563] Can I accept that I'm locked down and start my podcast so that I can use the time where I'm not traveling?
[564] Can I do that?
[565] I call that committed acceptance, okay?
[566] And it's very simple.
[567] If you commit and accept, if you accept things you can't change and commit to making your life better despite or because of their presence, nothing can beat you.
[568] Nothing can beat you.
[569] And yeah, is it horrible that I actually managed to move on and, you know, not hit my head against the wall for 27 years?
[570] Does that say I don't love Ali?
[571] What are you talking about?
[572] I adore Ali.
[573] I cry about missing him still today, right?
[574] It's not that it's, there is nothing to prove in that.
[575] What I can prove is I love him so much that I actually dedicate my life to spreading his message.
[576] That's so much better than sitting there and saying, ah, life hit me. I don't like life.
[577] That's a six-year-old attitude, honestly.
[578] Adults will say, okay, and especially business people, I mean, your audiences, the market changes all the time.
[579] Do you sit down and go like, I lost another deal, or do you just get up and say, why did we lose this deal?
[580] What can we do about it?
[581] And if there is something wrong with the product, can we change the product, right?
[582] Well, what you said there about business in particular rings very, very true, because in business, and you've been a very successful entrepreneur yourself and worked with teams, you'll get people who default to logic in moments of chaos and also default to personal responsibility, and those that don't.
[583] Yeah.
[584] And the outcomes of both groups are quite predictable.
[585] Very different.
[586] And actually, this approach of, is it
[587] true, can I do something about it?
[588] Can I accept it and commit?
[589] I learned that in business.
[590] So I've spent most of my career, I was managing managers.
[591] And what do managers do?
[592] They open your door and they sit down and complain.
[593] And after a while, it becomes too much.
[594] So my attitude was very straightforward.
[595] I would give them 10 minutes to vent.
[596] Then 10 minutes to ask them, is this true?
[597] Okay.
[598] Is there anything you're missing?
[599] Is the legal team also nice, not just making your life miserable, right?
[600] Have you seen evidence that they've helped you before?
[601] So, you know, is it true?
[602] And then I go like, so now, great, last 10 minutes of the meeting.
[603] What are we going to do about it?
[604] Are we going to be able to improve it, fix it, or are we going to accept it and do something despite its presence?
[605] And it's a very simple business approach.
[606] Now, most of us do that in business.
[607] But when it comes to our personal life, we don't do that.
[608] And interestingly, most of us, by the way, who do that in business are very successful in business and most of us who do that in life are very successful in life.
[609] It doesn't just make us happy.
[610] It makes us successful because it doesn't waste our cycles on things that are not necessary.
[611] So if you can do it at work, do it at home, do it in your life, do it in your relationships.
[612] It's really a very straightforward flow chart.
[613] You talked about, when you were talking about the chatter that arrived after Ali's passing and it was telling you maybe you should have driven him to another hospital.
[614] Maybe you should have done this and you could have done this differently.
[615] That is, you know, everybody has that chatter show up in their minds at certain points, which seems to be, you know, not necessarily your best friend.
[616] And it's sometimes suggesting that you should do X, Y, and Z, which would probably be destructive.
[617] What you said following that is that you almost like disassociated from it.
[618] And it was, it wasn't you, you were almost describing it as if it was someone else in your head.
[619] Absolutely.
[620] I call my brain Becky, yeah.
[621] You call your brain Becky?
[622] Yes, yeah, yeah.
[623] So, Becky is a third party.
[624] Yeah, yeah, yeah.
[625] Becky's not me. So think about it, huh?
[626] It doesn't take a lot of logic.
[627] Again, it's one of the illusions of the modern world.
[628] The illusions of the modern world, basically we glorify thinking so much that we think that the voice inside our heads telling us what to do is us telling us what to do.
[629] If it was you telling you what to do, why would it need to talk?
[630] And I think really you need to think about this.
[631] And there has been, you know, research in this since the 1920s.
[632] And Lev Vygotsky, I think his name is, a Russian Nobel Prize winner in the 1920s, basically won the Nobel Prize because he observed that the voice box, when you have that internal dialogue in your head, is moving ever so slightly, like it does when you speak out loud.
[633] Okay.
[634] And so MIT proved that. Actually, in 2007, there was a wonderful MRI study where they put participants in MRI machines, gave them word puzzles, and the participants' problem-solving areas of the brain would light up for as long as it takes to actually solve the problem, and then that would shut down, so no more problem solving, but the participant's still not aware of the answer, and the speech association area of the brain would actually light up for up to eight seconds, and then you would know the answer.
[635] Then the participant would know the answer.
[636] Okay, so literally your brain solves the problem, and then takes up to eight seconds to turn it into words to tell it to you.
[637] Your brain is literally talking to you.
[638] It's not, I think, therefore, I am.
[639] It's I am, therefore, my brain thinks.
[640] Now, the interesting challenge we have in the modern world is this.
[641] Nobody wakes up in the morning and tells themselves, I pump blood around my body, therefore I am.
[642] The biological function of your heart is to pump blood around your body.
[643] Yet we think, I think, therefore I am; I am that voice in my head. Now, you know, simply, if you realize that this is just a biological organ, okay, and the biological product of your brain is thoughts, and the currency of the brain is words, because the only building blocks of knowledge you have since you started to speak is words, okay? And so, accordingly, what your brain is doing is it's analyzing, you know, the world around it and presenting ideas so that you can choose.
[644] Now, if you think that those ideas are you telling you what to do, then you're going to obey.
[645] Okay?
[646] If it says, oh, life is miserable, then it must be true.
[647] Life must be miserable.
[648] But that's not the truth at all.
[649] If me and Becky are two different people, I can debate what Becky is telling me. I can refuse to obey what Becky is telling me to do, and I can tell Becky to shut the F up.
[650] Do you understand that?
[651] And I actually do it very often.
[652] I'm like, I'm working on something and Becky comes up with an idea, oh, my, your daughter doesn't love you anymore.
[653] I'm like, Becky, we're going to talk about this at six.
[654] It's as simple as that, right?
[655] And your brain does what you tell it to do.
[656] I mean, anyone listening to us, if you tell your brain, raise my right hand, your brain is not going to raise your left foot.
[657] It's just going to obey.
[658] Just tell it.
[659] You're the boss.
[660] Okay.
[661] So when your brain poisons you with all of those thoughts, follow the flow chart.
[662] Okay, Becky,
[663] vent a little bit, then tell me, is this true?
[664] Is there something I can do about it?
[665] Can we accept it and do something despite its presence?
[666] How difficult is that?
[667] And so, most happiness practitioners... I interviewed my dear, dear friend Matthieu Ricard on Slo Mo.
[668] Matthieu Ricard is known as the world's happiest man, with some 63,000 hours of lifetime meditation.
[669] His brain circuitry is literally different than ours.
[670] And I asked him and I said, Matthieu, so, do you sometimes get unhappy?
[671] And he laughed and in his funny French accent said, what are you talking about?
[672] No, I'm pissed off all the time.
[673] Okay?
[674] And basically all happiness experts will tell you unhappiness is a survival mechanism.
[675] It's alerting you.
[676] Your brain is saying, hey, something is not perfect.
[677] Can you please look into this?
[678] Right?
[679] The game is not to avoid that.
[680] That's actually harmful for you.
[681] You want to be aware of the things that might go wrong.
[682] Okay?
[683] The game is how quickly do you bounce back to happiness?
[684] From that moment where your brain says something is not right, how quickly do I go back to happiness?
[685] And I don't brag about this, but I say to encourage people.
[686] If I'm allowed to teach people about happiness, I need to be the Olympic champion of the sport.
[687] So I promise you, and I'm not bragging.
[688] On average, it takes me seven seconds.
[689] From the time my brain suggests something that disturbs my happiness, to the time I either dismiss it because it's not true, or decide what I'm going to do about it, or decide to accept it and think what else I'm going to do, is seven seconds. Seven seconds, okay?
[690] Yes, sometimes I get stuck in, you know, maybe three, four times a year, I get stuck in something that takes me, you know, a day to overcome.
[691] But most of the time, it's a very simple flow chart.
[692] It's a very logical process.
[693] And you've had to train yourself to get to that point.
[694] It's neuroplasticity.
[695] It is new.
[696] Go to the gym.
[697] Okay.
[698] Write the flow chart on a piece of paper and every time your emotion changes, look at it.
[699] Is it true?
[700] Can I do something about it?
[701] Can I accept it and do something despite its presence?
[702] It's really that simple.
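For readers who like to see that flow chart written out, here is a minimal sketch of it in code. The three questions are the ones described above; the function, its parameters, and the example thoughts are purely illustrative, not something Mo prescribes beyond the questions themselves.

```python
# A minimal sketch of the happiness flow chart described above.
# The three questions are Mo's; the function and its inputs are illustrative only.

def happiness_flow_chart(thought: str, is_true: bool, can_act: bool) -> str:
    """Run one unhappy thought through the three questions."""
    # Question 1: Is it true?
    if not is_true:
        return f"Dismiss '{thought}': it is not true, so there is nothing to be unhappy about."
    # Question 2: Is there something I can do about it?
    if can_act:
        return f"'{thought}' is true: decide what to do about it, and act."
    # Question 3: Can I accept it and do something despite its presence?
    return f"'{thought}' is true and cannot be changed: accept it, and act well despite it."

if __name__ == "__main__":
    print(happiness_flow_chart("My daughter doesn't love me anymore", is_true=False, can_act=False))
    print(happiness_flow_chart("I might lose my job in three weeks", is_true=True, can_act=True))
```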
[703] And you talk a lot about the illusions, and one of the illusions you said there was, was time.
[704] Oh, I love time.
[705] And the importance of presence.
[706] What role does time play in being present?
[707] Because also you talked about that, the world's happiest man, as you described him, being a great meditator.
[708] And I think from what I understand about meditation, although I'm not an expert, much of that is about bringing us to the present moment.
[709] Totally.
[710] I mean, there are two sides to time that we need to understand.
[711] One you can easily understand from theory of relativity and Einstein's view of space time.
[712] Anything you know about time is not real.
[713] As a matter of fact, nobody has a clue what time is.
[714] Okay.
[715] And we have to accept this, that the illusion of time in the modern world is because we've managed to control what we've measured.
[716] We're measuring mechanical movements that sort of hint at the passage of time, and we now can show up on time and, you know, have an interview that we can measure as an hour and a bit, and so on and so forth.
[717] But time itself, we don't know, okay?
[718] The only understanding we have of time is that we're being propelled forward through space time along the arrow of time, okay?
[719] And that every time, every slice of space time, we're standing there in that slice only living here and now, okay?
[720] You've never, ever lived yesterday.
[721] Do you realize that?
[722] When you lived yesterday, it was called today.
[723] You're never going to live tomorrow.
[724] When you live tomorrow, it's going to be called today.
[725] Okay.
[726] It's always right here and right now.
[727] And I did an interesting analysis in Solve for Happy, where I listed down the majority of human emotions and plotted them across where they are anchored in time and across whether they're positive or negative.
[728] Okay.
[729] So you take an emotion like regret.
[730] Regret is anchored in the past.
[731] It's about something that happened in the past and it's negative.
[732] You take something like anxiety.
[733] Anxiety is anchored in the future.
[734] It's about something that might happen in the future and it's negative.
[735] The majority of negative emotions are anchored in the past and the future.
[736] The majority of positive emotions are anchored in the present moment.
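As a rough illustration of the analysis being described, here is a tiny sketch of that emotion map as data. The specific emotions and labels below are examples chosen for illustration, not Mo's full list from Solve for Happy.

```python
# Illustrative sketch of the analysis described above: each emotion tagged by
# where it is anchored in time and whether it is positive or negative.
emotions = {
    "regret":    ("past",    "negative"),
    "shame":     ("past",    "negative"),
    "anxiety":   ("future",  "negative"),
    "fear":      ("future",  "negative"),
    "joy":       ("present", "positive"),
    "gratitude": ("present", "positive"),
    "awe":       ("present", "positive"),
}

# Group the emotions by their time anchor to see the pattern Mo describes.
for anchor in ("past", "present", "future"):
    group = [name for name, (when, _) in emotions.items() if when == anchor]
    print(anchor, "->", group)
```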
[737] If you're here and now, there's absolutely nothing wrong.
[738] I mean, think about it this way.
[739] If you're listening to us having this conversation as one of our listeners, by definition, there is no tiger trying to eat you.
[740] You know, the reality, which is really shocking, the reality that you can feel unhappy about something in the past and the future is in itself evidence that now is okay.
[741] Because if there was a tiger trying to eat you right now, you wouldn't be thinking about losing your job in three weeks.
[742] So the truth is, every time, remind yourself that the fact that I'm thinking about the past and the future is itself evidence that now is fine.
[743] There is a roof on top of my head.
[744] I'm obviously not starving.
[745] You have an electronic device that allows you to listen to us.
[746] Life is okay.
[747] That's so crazy.
[748] I've never actually thought about the practical side of what you just said then: that the person listening to this now, focusing on the sound of my voice, is not actually unhappy now.
[749] Absolutely not.
[750] They're not.
[751] When they stop listening to it, their thoughts might descend into unhappiness, past, future, regret, whatever. But as they're listening to this, they're not actually in the state of unhappiness. They have to stop listening and stop engaging to create room. Yeah, to create space for that. Remember Inception? Yeah, yeah, yeah. At the beginning of Inception the question was, what is the most deadly parasite, or whatever, I don't remember, and he said it's an idea. An idea. There is nothing in your life that has the power to make you unhappy until you turn it into a thought, a negative thought, and turn it in your head and torture yourself with it.
[752] Simple as that.
[753] If you're locked down and you're at home and you have food and you have shelter and you're not sick, and hopefully none of your family and loved ones is sick, okay?
[754] The only way you can get yourself to be unhappy is to say, I don't like this.
[755] I want life to be different, right?
[756] And that's not going to change life, interestingly, that thought, right?
[757] But it's going to change you and make you miserable.
[758] It's the only way you can make yourself unhappy.
[759] You just lost another 8% of Alice.
[760] Yeah.
[761] You have a, yeah, hopefully we're getting 8% more somewhere else.
[762] You have a tattoo on your back?
[763] I don't.
[764] Oh, you don't.
[765] Oh, Ali did.
[766] Was it, Ali, about, oh, yeah, the gravity of the battle.
[767] gravity of the battle.
[768] Could you explain that to me?
[769] I found that really amazing.
[770] It's the last thing he told me. Can you believe that?
[771] So Ali, Ali had a tattoo on his back that read the gravity of the battle means nothing to those at peace.
[772] Okay?
[773] And you would wonder why?
[774] Because he's, you know, he lived the life of ease and luxury in general.
[775] I mean, even though he always forced himself to go to the, I mean, he had those journeys where he would go and literally walk across the villages of America with no money just to live the life of the people, the real people, if you want, he would go to, you know, it was very unusual.
[776] Anyway, yeah, his tattoo said the gravity of the battle means nothing to those at peace.
[777] And it was the very last thing he told me, because basically he was wearing those scrubs, and you know how the scrubs are open at the back, and he was lying on the operating table, and then they were fixing something so he had to sit up, and I could read the tattoo. It was the very last thing he told me before he went into the operating room. And yeah, it's quite interesting when you really think about it. Of course life is full of battles. Life is not supposed to be easy, just understand that. Think of life as a video game. If it was easy, it would be boring like hell and you would learn nothing at all. Life is supposed to have a few difficulties on the way. But some battles are much harder for you than they are for me, and many battles don't even shake those who are at peace, right? And the question is, how do you find that peace? How do you find that feeling of, it's okay, it's just another battle, you know, I've won every other battle so far, and the ones that I've lost were the best thing that ever happened to me? How can you get that straight in your mind?
[778] Because then suddenly when the next battle approaches, and I promise you there will be a next battle approaching, okay, you take that battle with complete peacefulness.
[779] Basically treat it as one more twist on the game controller where you can actually affect your life and make it better and then wait for the next one.
[780] and then another twist on the game controller, and you can do better.
[781] It's a very stoic approach to life, but it's so spot on when you think about it.
[782] I was reading something you said, which really did make me pause for a second, and it said, you know, correct me where I'm wrong, because I don't know if I've mischaracterized what you said, but you were saying how basically nobody regrets their battles.
[783] Like, basically nobody would reverse history and undo the hard thing they went through.
[784] And I sat and reflected, and I thought, I thought, all my guests that come here and sit in that chair, and all the things they've told me about.
[785] And I thought, I think you're probably right.
[786] 0.1%.
[787] So I ran an experiment.
[788] So in the chapter about control, I wrote something that I called the Eraser Test.
[789] The Eraser Test is a thought experiment.
[790] Imagine that at Google X we've developed something that can pinpoint a memory in your life and go back to that event and erase that event from your timeline, okay?
[791] Not the memory of it, the actual event.
[792] It will erase the whole thing.
[793] And I ran that experiment with maybe 12,000 people, where I basically tell them, first, write down an event that's traumatic in your life.
[794] Second, make a choice.
[795] Do you want me to erase it or not?
[796] Third, be aware that if you erase it, you're going to erase everything that happened as a result.
[797] Every friend that you met as a result, every learning that you had as a result, every resilience that you developed as a result.
[798] Would you still erase it?
[799] 99.99% of people said no. I'd keep it.
[800] Okay.
[801] And these were very traumatic events.
[802] It's not just some bully at school.
[803] So, you know, I will tell you openly myself, I cried on stage in 2019 because someone asked me and said, now that you have the eraser test, would you erase the death of Ali?
[804] And I wouldn't.
[805] I wouldn't.
[806] Because I will tell you openly, if you know my son, if I had told Ali before he died that his death would make 51 million people happy, he would have
[807] said, kill me right now. And I don't know, I mean, is it radical acceptance, or is my brain telling me this? But most of the time, most of the time, the person that you are is the result of those moments. It's not the result of the easy parts. And you would never erase it. 99.99% of the people would not erase it. Easier said in hindsight, I guess. Of course. But then, but then, let's extrapolate.
[808] If I would not erase 99.99% of the harsh events in my past, why am I thinking that this one is the one that's going to stick?
[809] Think about it.
[810] If all of your past's harsh experiences were painful then, but enormously important now, then maybe this one too will be enormously important.
[811] Maybe this is the one that's going to make you who you are.
[812] It reminds me actually of something you said about death as well.
[813] because much of the reason, I think, again, correct me if I'm mischaracterizing, but you said that people fear death because of the uncertainty it brings, and not knowing what, you know, the life after death will look and feel like.
[814] And I reflect on what we're talking about with this eraser and so I think a lot of the reason why I might not choose to erase traumatic events in my life or the worst things that happened to me is because then I don't know which way that kind of my life would have gone then and it could have gone in a worse direction.
[815] So in this current moment, I feel somewhat content.
[816] You know, I'm just, I'm talking about the potential there, like those 99% of people that you described.
[817] And there's a chance it could be worse if I use that eraser.
[818] So I'm not going to use that eraser because.
[819] Correct.
[820] Yeah.
[821] Correct.
[822] And most of the time, yes, when you look back in hindsight, you start to recognize all of the benefits that came with the trauma.
[823] And most of the time, interestingly, the way the physics of life work is that the benefits outweigh the trauma.
[824] Interesting.
[825] I wrote a book called Happy Sexy Millionaire.
[826] Nice.
[827] Because I was an 18-year-old kid that, because of all the insecurities from my childhood, being the new black kid in an all-white school, parents were broke, but everyone else around me was rich.
[828] So it creates this cauldron of like insecurity where you want to be, you want to fit in, and then it leads you to the path of thinking that money and material possessions will be the thing that makes me fit in.
[829] So I go off on the path to try and be this happy sexy millionaire.
[830] Of course, 25 years old.
[831] Range Rover Sport sat outside.
[832] I'm a multi-millionaire, big successful business, six-pack, all these things.
[833] The day our company IPOs, it's worth 300 million.
[834] And just this total anti-climax, and I'm like, where is the marching band and the confetti?
[835] Like, where is it?
[836] 18-year-old me, he promised me that.
[837] And so now I reflect, I look forward and think, well, I need to be careful about some of these ambitions I have because I don't know whether that's the insecurity defining the path, or if it's my sort of intrinsic, these are intrinsic things that will make me feel content.
[838] So my question is about how do I know if my ambitions, how do I reframe them now to make sure that they are leading me to a happy place or a fulfilled place and not just scratching some unscratchable insecurity?
[839] That's so interesting.
[840] Did the sexy girlfriend make you happy or make you miserable?
[841] Miserable.
[842] The sexy one did, made me miserable.
[843] But the one that wasn't so, that didn't care about being sexy and cared about other things, you know, had those good values, made me much happier.
[844] Did the range rover break down any time?
[845] And I know, yeah, a couple of times.
[846] It got smashed up.
[847] People broke into it all the time.
[848] There you go.
[849] It cost a lot of money, hard to park.
[850] So I think we need to differentiate between two sides.
[851] One is ambition and the other is expectation.
[852] Okay.
[853] So have any ambition you want.
[854] Any ambition you want, hopefully a good ambition, be a good, a good billionaire.
[855] Okay. So my dream is that by the end of my life, I will have lost all of the money I made, not taking it to the grave anyway, and made a billion people happy. That's a very, very interesting definition of wealth, okay? Have one of those, or have any definition you want, any ambition you want, but have the right expectations. There is a difference between ambition and expectations. Ambition is what gets us to strive and thrive in life and go further and have an impact. Great.
[856] Set as many of those as you want.
[857] When we achieved 10 million happy, we set a billion happy.
[858] Okay.
[859] But don't get me wrong.
[860] It took Jesus 2,000 years to get to a billion people.
[861] I'm not going to get there.
[862] Let's just be very clear.
[863] The expectation is clear.
[864] My expectation, my best dream, is that I will energize enough people to take the mission forward.
[865] Okay.
[866] And that's my expectation.
[867] So you know what my expectation is today?
[868] My ambition is a billion happy.
[869] My expectation is that those listening to us are happy, okay? And if that fails, that you are happy. That's good enough. That's an amazing day, right? And once you set your expectations right, nothing can dent your happiness. Don't get me wrong, I wake up every morning and I go like, what are we going to do today? Can we reach, you know, a hundred thousand people today? Is there a piece of content we can develop? Can we do this? Can I write another book? I don't know. And I'm constantly engaged. That's my ambition. There are days where I wake up and nothing happens.
[870] Great.
[871] That exists, part of life.
[872] When you differentiate those two, everything becomes okay.
[873] Now, I would also say when you're setting your ambitions, avoid junk food.
[874] Avoid the stuff that was promised to you to make you happy before and failed to make you happy then.
[875] Okay?
[876] I know that because I had 16 cars in my garage.
[877] The reason I had 16 cars in my garage, was because I thought the first one would make me happy and it didn't.
[878] So I told myself, it's the color.
[879] I should have taken another color, right?
[880] So I bought another one.
[881] Oh, it's the model.
[882] Okay?
[883] And then I was like, no, but I don't have a fast car.
[884] Maybe I need a fast car. Maybe a fast car will make me happy.
[885] You know, a vintage car will make me happy.
[886] And you know what happens?
[887] Every time the promise is missed, you go like, oh, no, hold on.
[888] Maybe more or different is going to make me happy.
[889] Wake up.
[890] You're a smart person.
[891] Those things don't make you happy, okay?
[892] And just measure, look back in your life and find the actual moments that made you happy.
[893] You know, I have a practice that I call the happy list.
[894] And on the happy list, I say, write down as many answers as you can to the statement that starts with, I feel happy when.
[895] Okay?
[896] Nobody ever wrote, I feel happy when I buy a Ferrari.
[897] Okay?
[898] Yeah, you get that for a couple of moments.
[899] Nobody says, I feel happy when I win the Nobel Prize.
[900] People say, I feel happy when my daughter smiles. I feel happy when I have a good cup of coffee. I feel happy when I have a connected conversation, when I learn something new. All of them accessible. All of them things that you can introduce in your life today, or this week at most. And yet we don't do any of them, because we want to buy the Ferrari. Made me miserable, I swear to you. I think Ferrari is going to sue me. It made me miserable, okay? It always broke down. It was so noisy. It was so noisy. And it didn't meet my character. I'm a simple guy. I don't want to be looked at in the streets, right? And it made me miserable. And yet I tried. And you know what's the funny thing? The funniest thing is that, I swear to you, I would take one of those cars out, and two minutes into the drive I wouldn't remember which one it is. Because when you're driving, what are you looking at? The road. Okay, especially when it was dark and at night, you just don't see the car anymore. It's like something's taking you somewhere, right? And yet I keep trying. It's so stupid. Gratitude. Oh, yes. The theme that came to mind when you were speaking then, because some people think, and I was definitely one of those, that the way to have more in your life is to go and buy more. But in fact, and this is one of the conclusive points in my book, it was that you can create so much more with gratitude from what you already have. Four-dollar t-shirts. Absolutely love them. Make my life so easy.
[901] I don't really have to iron them.
[902] I don't have to, you know, worry about... When I go on a date, the first thing I say is that this is what you're going to see every time.
[903] I hope you're going to find other qualities in me. And it's very open.
[904] And if she doesn't like that and she wants the Armani suit, she's not for me. It's very good.
[905] Okay.
[906] Now, the game is this.
[907] Gratitude is the ultimate solution to the happiness equation because it doesn't only remind you that the event is meeting expectation.
[908] It tells you that the event is so much better than expectation that you're grateful for it.
[909] Okay.
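For reference, the happiness equation being alluded to here is usually stated in Solve for Happy roughly as follows; this is a paraphrase of the book's formulation, not a quote from this conversation:

```latex
\text{Happiness} \;\ge\; \text{your perception of the events of your life} \;-\; \text{your expectations of how life should be}
```

Gratitude, on this reading, is noticing that the event did not just meet the expectation but comfortably exceeded it.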
[910] And it does something else that's amazing.
[911] It's neuroplasticity at its best.
[912] It basically tells you, okay, brain, I know you're grumpy.
[913] I know you want to tell me the seven things that went wrong today, but your task right now is to go and find the thing that you're grateful for, something that went so well that you're happy with it.
[914] Go brain, do your work.
[915] And if you do that every day, hopefully several times a day, suddenly your brain goes like, oh, when I was searching for that thing, you asked me that I was grateful for, I found three other things I'm grateful for because I'm getting really good at it because life is full of blessings, right?
[916] It is the absolute answer.
[917] And you know what's the most interesting part of gratitude?
[918] In part of Solve for Happy, I talk about a concept I call look down.
[919] Okay?
[920] And look down is the idea that, If I compare to the guy that had 17 cars, I'd feel miserable.
[921] Okay?
[922] If a model compares to the supermodel, she would feel miserable.
[923] Okay?
[924] If you compare to the guy in Africa or the guy in India or the guy in Afghanistan or the refugee camp, you would actually suddenly realize, oh my God, I'm so blessed.
[925] One of the most interesting statistics is the Nordic countries.
[926] They measure something called subjective well -being.
[927] Subjective well -being, basically, is the quality of your life.
[928] They have the highest subjective well -being on the planet.
[929] And yet, they have some of the highest suicide rates.
[930] Why?
[931] Because as the quality of our life increases, we keep looking up.
[932] We keep raising our expectations.
[933] Like we have a service -level agreement with life.
[934] Now that the government can give me, you know, health care and pension and so on, then maybe my girlfriend shouldn't annoy me either.
[935] Where do you get that from?
[936] Because show me their contract, right?
[937] And suddenly it's just constantly resetting.
[938] If you look down, if you look down, I promise you it's going to take you a very long time to get to the bottom.
[939] There are so many people in the world that are so much less fortunate than you are.
[940] It's almost arrogant and, so sorry to say, stupid to not recognize that.
[941] Okay.
[942] To not, if you live in the UK, by definition, you're one of the luckiest 10% alive.
[943] It's as simple as that.
[944] Okay, at least most, the majority of the people in the UK.
[945] And by the way, if you're listening to Stephen, then I can guarantee you you are.
[946] It's so true.
[947] Do you know what?
[948] I've never really... so I thought this idea of practicing gratitude was kind of some like fluffy, airy-fairy thing.
[949] And then over the last, I'd say three years when I really, I reflected on the moments when I'm just alone and I get overwhelmed to the point where it's like slightly emotional.
[950] Oh, yeah.
[951] Usually sometimes by music or whatever it might be, I can genuinely make
[952] myself overwhelmed to almost the point of tears with gratitude.
[953] Absolutely.
[954] And I did it last night.
[955] I was in my bedroom and I was walking through and I just, I, and I posted on my Instagram.
[956] I think I posted, blessed, grateful and enough, just the three words.
[957] Because I have that sometimes, this overwhelming feeling of like, yeah, and I...
[958] So lucky.
[959] And it's so lucky.
[960] And you know, that is a choice.
[961] That was a choice to have those thoughts, to have those thoughts about like, enough.
[962] I have everything I need and to be quite honest, way more.
[963] And to be fair, I always have.
[964] before the money.
[965] I've always been enough, you know, and it was society's attempt to convince me that I wasn't.
[966] $4 t-shirts.
[967] How many of those can I buy?
[968] You think about it.
[969] And yet some of us waste an entire lifetime trying to get more Armani suits.
[970] Why?
[971] Because of the Matrix, whatever they call it, you know.
[972] So you're smart.
[973] You're so smart to be able to make the money to buy the Armani suit.
[974] And yet you're still a sucker for the people that are, for the eyes of the people that are going to think of you as more than you are because you're wearing it?
[975] Seriously, I mean, if it pleases you, by the way, nothing wrong with being fashionable and beautiful and taking care of yourself, absolutely.
[976] But if you're doing it for the ego, what does that say about you?
[977] You're so freaking successful.
[978] And you're still, you know, expecting that people will value you more because you're wearing a suit?
[979] I don't know.
[980] There's something you said as well on that exact point, but also on your previous point about going on these dates
[981] and saying to the young person or the, you know, the lovely person you're on the date with, listen, this is a $4 t-shirt and I'm going to wear it a lot.
[982] So if you don't like that.
[983] I wear 10 of them.
[984] It's not the same t -shirt.
[985] I have 10 of them at any point in time.
[986] Don't spoil my reputation here.
[987] I was going to say, well, you know, it's very similar to the one you were wearing.
[988] There's three pairs of jeans and 10 black t-shirts.
[989] You said there, you said, if she's not okay with that, then in fact, she's actually not the person for you.
[990] Absolutely.
[991] And actually trying to forge a relationship that is on that basis would probably lead you to a not so good place because your life would become a hamster wheel of valuing that.
[992] So conditional love.
[993] You talk about this concept of conditional love and I don't necessarily know what you mean because I didn't get the full definition.
[994] But what is conditional love and what's the danger of it?
[995] Conditional love is, I love you because we're having this conversation and it's going to go to tens of thousands of people.
[996] Thank you.
[997] Okay?
[998] Yeah, but that actually ends if it doesn't go to tens of thousands of people.
[999] You know, I love her because she's cute and she makes me laugh.
[1000] Unconditional love is real love, okay?
[1001] Unconditional is, I love my son.
[1002] He pissed me off when he was young.
[1003] I loved him.
[1004] He taught me when I was young.
[1005] I loved him.
[1006] You know, he left me and caused me pain when he left.
[1007] I mean, it's not his choice, but, you know, and I still love him.
[1008] He's not even part of this world and I love him.
[1009] There are no conditions for my love.
[1010] There are no conditions for my love for butterflies.
[1011] Okay?
[1012] I love butterflies, even though there are, none in this room.
[1013] None of them is entertaining me. I just love butterflies.
[1014] It's just a feeling that I don't understand.
[1015] I can't explain with an equation.
[1016] And it's always there.
[1017] And it's the only kind of love, the only kind of love that makes us happy.
[1018] Understand that.
[1019] All of the other kinds of love are anchored in conditions, anchored in what?
[1020] In expectations.
[1021] And anything you anchor in expectations, sooner or later is going to change.
[1022] If you love her because she's beautiful, sooner or later she's going to grow a little older and someone else will be more beautiful.
[1023] What are you going to do then?
[1024] Okay.
[1025] If you love him because he's your business partner and making you a lot of money, what are you going to do when things go a little difficult?
[1026] Okay.
[1027] And the idea is when you go to unconditional love, something amazing happens.
[1028] Suddenly you're in control because the joy of unconditional love is to give it.
[1029] There are no conditions.
[1030] You're not expecting anything to, you know, conditional love is reciprocity.
[1031] Hey, I'm going to love you for this.
[1032] And in return, you're going to do this for me. Okay?
[1033] And of course, the other side of it is the ego of love.
[1034] It's like, I'm lovable.
[1035] You know, I'm going to create the conditions for you guys to love me. So look, I'm lovable.
[1036] Well done, Mo. Right?
[1037] You've created something amazing.
[1038] You should be proud of yourself.
[1039] None of that matters.
[1040] All of that gets you, of course, romantic love is wonderful.
[1041] and, you know, the love of, you know, partners in business.
[1042] Yeah, beautiful, wonderful, huh?
[1043] You want to get to the core of unshakable happiness that comes from your love.
[1044] Learn to love beyond conditions.
[1045] And if you can learn to love beyond conditions, I promise you the world will love you back without conditions.
[1046] So the days when you're not at your best, you're still going to be loved.
[1047] Romantic love.
[1048] Let's take a pivot there then.
[1049] Oh, man. Before we started recording, we talked a little bit about the modern world of dating and how difficult it is because of the way the world's changed, and, you know, the battles with romantic love. You know, I know you were in a relationship for a long time. Yeah, 28 years, yes. And then back on the dating market. Yes, not doing great. You were with someone for 28 years? Yes. I'm, like, 28 years old. I know, that's staggering, you know. Oh, she's an amazing woman, and we're still best friends. She's still, in my eyes, the best woman that ever existed.
[1050] And I think in her eyes, I'm still the best man that ever existed.
[1051] And I think it's a beautiful, beautiful, beautiful, beautiful connection.
[1052] And it's doable.
[1053] And it's actually, if anyone needs to experience that once in their life.
[1054] Do you know what question I'm going to ask you?
[1055] No. If you both think you're the best people that ever existed.
[1056] Huh.
[1057] Love and relationships are two different things.
[1058] Love is feeling. Relationships are compatibility and fit and work, okay, and progress and projects and, you know, partnerships and lots and lots of things. Relationships. So in the book I'm working on, I actually have a section about finding love and a section about keeping love, right? Because they're not the same thing. And, you know, Nibal and I, you know, she's the most amazing woman ever, and, at least from what she gave me in life, I will eternally be grateful. Everything that I am, any comment that I said today, was discussed one day with Nibal, right? Her spirituality, her wisdom, her, you know, her love made me who I am, okay? And you can't kill that because you no longer want to sleep together. Do you understand this? You can't be that stupid, to take all of those beautiful relationships and just say, okay, that's it.
[1059] We're separated.
[1060] We don't want to talk, right?
[1061] The thing is, I believe that love is short -lived.
[1062] I believe that Nibal and I had to fall in love six times over the 28 years, because we both changed every single time.
[1063] So she was my college sweetheart.
[1064] We had that amazing, you know, puppy love and wonderful romantic relationship and so on, and then we get married.
[1065] And then she becomes a different person and I become a different person.
[1066] And you suddenly go like, where is my sweetheart?
[1067] Okay.
[1068] And you have a choice then either to walk out and say, I'm going to go look for my sweetheart.
[1069] Or in our case, we go like, oh my God, that sweetheart is gone, but my God, this one is so cute.
[1070] I love that one, right?
[1071] And we fell in love again and again and again, six times, until our paths went in literally opposite ways, where my path went into, okay, I'm going to write a book and tour the world and do more of what I do.
[1072] And her path went into, okay, it's time for me to start focusing on my own life.
[1073] I want to, you know, focus on my own business, focus on my own stability.
[1074] I don't want to travel the world like a maniac.
[1075] And it became difficult.
[1076] It became difficult to go back every two weeks when we haven't met and feel guilty that I have not been there for her and she hasn't been there for me. And so one day we sat down, we spoke, we hugged, literally hugged.
[1077] And then, you know, said, okay, maybe it's time to try another experience.
[1078] And, and then we went back to the same home and spent another week together.
[1079] And then I left.
[1080] And it's beautiful.
[1081] It's beautiful as it is.
[1082] I think, you know, we still carry each other's credit cards and we still, you know, manage our investments together.
[1083] And there is total trust and total, you know, understanding.
[1084] and we still, you know, parent Aya, our daughter, together.
[1085] And it's wonderful.
[1086] It's just that romance is one part of the different melody of loves that you can feel for someone.
[1087] Such a beautiful level of maturity and, I guess, love, you know.
[1088] Yeah, of course.
[1089] I mean, you should meet her.
[1090] I think it becomes easier if you understand how she is.
[1091] So pandemics, let's talk pandemics.
[1092] From love to pandemics. Yeah, I mean, this is exactly what I felt when I was reading about your story. It's such a diverse range of topics that in so many beautiful ways intertwine and are influenced by each other. I read, in a recent sort of news article that had just come out, I think in the last couple of weeks, that you feel that we're currently engaged in a productivity pandemic. Well, engaged in many pandemics. The truth is that we are focusing on the silly ones. What is the real one? The real one is artificial intelligence, no doubt. Why? Because it's here to stay and evolve and become bigger and bigger and bigger, and influential in ways that we have not even started to consider yet. COVID is here to come and go. For someone that doesn't know what artificial intelligence is, which is a lot of people, yeah, probably more than 95% of the listeners of this podcast, what is artificial intelligence? Great question. So the reason for Scary Smart, my book, is entirely around that: that there are so many people out there that have no clue that they have interacted today, whatever time of the day it is for you, whatever you are, whatever you do in life, you've already interacted with 10 to 12, 15, 20, maybe 50 artificial intelligences that are all smarter than you. That's the truth of our life. Artificial intelligence: there was a turning point in the history of technology, before which all technology was programmable, so it was simply a tool that extended the capability of humanity, okay?
[1093] You can't hammer a nail with your hand.
[1094] You use a hammer, you know, you can hammer a nail with the hammer.
[1095] And the hammer will do exactly what you tell it to do.
[1096] Beyond the turn of the century, we've discovered something that's called deep learning.
[1097] And deep learning allows machines to learn on their own.
[1098] We don't understand how they learn.
[1099] Developers that write the code don't even understand how they learn, but they develop intelligence.
[1100] They become able to make autonomous decisions based on intelligent observation of their environment and the conditions that surround them.
[1101] And they make those decisions on every specific task we've given them better than humans.
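To make the contrast with programmable tools concrete, here is a deliberately tiny, hedged sketch of a machine learning a rule from examples rather than being given the rule. It is a single artificial neuron in plain Python; real deep learning stacks many layers of vastly more units, but the principle that nobody writes the decision rule in by hand is the same.

```python
# A tiny sketch of "learning from examples" rather than being programmed:
# one artificial neuron learns to separate two groups of points from data alone.
# Real deep learning stacks millions of such units; this only illustrates the principle.
import random

random.seed(0)

# Training data: points labelled 1 if x + y > 1, else 0. The program is never told this rule.
points = [(random.random(), random.random()) for _ in range(200)]
data = [((x, y), 1 if x + y > 1 else 0) for (x, y) in points]

w1, w2, b = 0.0, 0.0, 0.0   # the neuron's adjustable parameters
lr = 0.1                    # learning rate

for _ in range(50):         # repeatedly nudge the parameters toward fewer mistakes
    for (x, y), label in data:
        prediction = 1 if (w1 * x + w2 * y + b) > 0 else 0
        error = label - prediction          # -1, 0 or +1
        w1 += lr * error * x
        w2 += lr * error * y
        b += lr * error

correct = sum(
    1 for (x, y), label in data
    if (1 if (w1 * x + w2 * y + b) > 0 else 0) == label
)
print(f"learned weights {w1:.2f}, {w2:.2f}, bias {b:.2f}; accuracy {correct}/{len(data)}")
```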
[1102] Something you said scared me before we started recording, which was that, you know, you're a very smart guy, right?
[1103] And your professional experience, especially working at a company like Google, which is known for its artificial intelligence capabilities... for you to say to me that you've basically given up your summer to go around the world talking about your new book, Scary Smart, and the implications of artificial intelligence, and you're basically choosing to, you know, because you understand the importance of time and you could be doing anything.
[1104] For you to consider this, the important work of your life in this period, begs the question, why is it so important?
[1105] Why are you giving up so much to spread this message?
[1106] It is the single most important message on our planet today.
[1107] I don't think people realize.
[1108] So AI today is better than humanity in everything it does.
[1109] By the year 2029, the smartest being on planet Earth is going to be a machine.
[1110] Now, I just, I don't want to, we can go into the details, huh?
[1111] But I want you to imagine a scenario on planet Earth.
[1112] where we're not the humans, but the apes, okay?
[1113] Where there is another being that looks at us as the apes, okay?
[1114] And that being is going to be smarter than us.
[1115] You heard me correctly in eight years' time.
[1116] Eight years' time.
[1117] Eight years' time.
[1118] And we're not talking about it, Stephen.
[1119] What's going wrong with humanity?
[1120] We're not talking about it, okay?
[1121] If you look at AI from the inside, you realize that through the law of accelerating returns, Ray Kurzweil basically predicts, that by the year 2045, it's in your lifetime and mine, AI is going to be a billion times smarter than humans.
[1122] One billion times smarter.
[1123] You know what that means.
[1124] This is comparable to the intelligence of Einstein as compared to a fly.
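As a back-of-the-envelope illustration of that billion-times figure under a steady-doubling reading of the law of accelerating returns (the doubling period below is derived for illustration only; it is not a number quoted in this conversation):

```python
import math

# A billionfold increase is roughly 30 doublings, since 2**30 is about 1.07 billion.
doublings_needed = math.log2(1e9)      # ~29.9 doublings
years_available = 2045 - 2021          # the horizon mentioned above
months_per_doubling = years_available * 12 / doublings_needed
print(f"{doublings_needed:.1f} doublings needed")
print(f"one doubling roughly every {months_per_doubling:.1f} months would get there by 2045")
```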
[1125] And humanity, while we're still Einstein, is not discussing how are we going to keep the best interest of us, the fly, in Einstein's mind.
[1126] Okay?
[1127] We're talking through the arrogance of humanity about how we're going to control them, how we're going to box them, how we're going to tripwire them.
[1128] Good luck with that.
[1129] The biggest hacker in the room, the smartest hacker in the room will always find a way through our defenses.
[1130] Now, scary smart is written in a very unusual way because it's not a book about artificial intelligence only.
[1131] It's a book about the role of humanity in the age of the machines.
[1132] And it's split into two parts.
[1133] The first part is the scary part.
[1134] And if you and I dive into this, I promise you you're going to be scared.
[1135] The second part of it is what I call the smart part.
[1136] So five chapters are scary like hell.
[1137] Most of my early readers would call me after five chapters and say, should I commit suicide?
[1138] Right.
[1139] And then the other five chapters are a story of hope.
[1140] Basically, it's entirely within our hands to do something so quickly, so simply that can save our world.
[1141] Okay.
[1142] Now, the difference between those two is a question of awareness.
[1143] that people are preoccupied talking about COVID and what the prime minister decided and the next profitability quarter of our business, and people are not talking about the existential challenge that we have ahead of us, okay, which is imminent: eight years, and 2045.
[1144] So I start the book with a thought experiment.
[1145] I say you and I are sitting in front of a campfire in the middle of nowhere in 2055, okay?
[1146] I'm going to tell you the story of what happened between 2021 and 2055 from that perspective, okay?
[1147] I'm not going to tell you why we're sitting in front of the campfire.
[1148] Why are we hiding in the middle of nowhere?
[1149] Is it because we're hiding from the machines?
[1150] Or is it because the machines have built a utopia that allows us to enjoy nature and connection and the luxuries of life?
[1151] Right.
[1152] And the difference between them is really straightforward.
[1153] The difference between them is what you and I and everyone listening, not the developers, not the government, okay?
[1154] Not the regulators.
[1155] What you and I and everyone listening are going to teach those machines. Because those machines don't learn from their developers. The minute they're out there, they learn from your swipes on Instagram. They learn from your retweets. They learn from your fights, your arguments, from this conversation. And if we don't shape up as humanity, at least some of us, enough of us, those machines will magnify the essence of what we are today as humans, and that's not really pretty. Are you optimistic about this? I am 100% optimistic, okay? And it's a great question to start with. I believe eventually, eventually, we're going to end up in a utopia, whichever way. Let me explain. We humans have been able to create this amazing setup that you and I are sitting in, to record and communicate with the world, because of our intelligence. But we've also destroyed the planet because of our limited intelligence. So, you know, we found a way to create a supply chain that can supply Mo with a slice of watermelon right around the corner in a supermarket, because we're intelligent, but we could not be intelligent enough to not use single-use plastic.
[1156] We created mobility because we're intelligent, but we could not be intelligent enough to stop burning fuel to kill the planet.
[1157] It's our limited intelligence that is the hindrance of humanity.
[1158] And AI is surpassing our limited intelligence very quickly to the point that it will get to the ultimate form of intelligence.
[1159] And what's the ultimate form of intelligence?
[1160] The ultimate form of intelligence is the intelligence of life itself.
[1161] It's the intelligence of abundance where AI would see no reason to crush the fly.
[1162] You know how life is.
[1163] Life will say more flies, more, you know, antelope, more tigers, more poop, more everything.
[1164] Let's just, you know, let's have more of everything and everything will thrive.
[1165] This is where AI will get to, 100%.
[1166] I have no doubt about that.
[1167] The challenge is the journey from here to there. Do we want that journey to be smooth and straightforward and wonderful, or do we want to hit bumps on the way? And if we don't want to hit those bumps on the way, it's not about their intelligence. It's about their ethics. It's about the ethics of the machines, and that's a very, very important conversation. They're no longer machines. They will develop a code of ethics. They are independent. They're sentient in every possible way. And this is truly the core of Scary Smart: an explanation that we're no longer building computers.
[1168] This is not the programmable technology of the pre, you know, turn of the century.
[1169] This is autonomous, it is independent, it is intelligent, it evolves, it procreates.
[1170] That's terrifying.
[1171] It is terrifying.
[1172] And we're not talking about it.
[1173] For anyone that doesn't know what procreate means: it basically creates more of itself, I guess.
[1174] It replicates.
[1175] We, humans, we take a nine-month cycle, if we find the right person and we're in the mood.
[1176] They take microseconds to create 700 million copies of themselves if they wanted to.
[1177] Can I ask a really specific question?
[1178] Because whenever we go into this conversation around machine learning and AI, people think of it as robots, sort of like marching down the streets with guns because we've seen that's the image we've had portrayed to us in movies like Terminator and such.
[1179] Yeah, that's never going to happen.
[1180] Okay.
[1181] So what is the realistic practical threat?
[1182] So let's just take that moment, that point for a minute, huh?
[1183] One of the most interesting parts of AI being sentient is that it has agency.
[1184] Okay.
[1185] And that agency is available through robotics, whether that's a humanoid robot walking down the streets, and there are many killing robots being created as we speak, okay, in the US army and in the, you know, in the Chinese army and so on.
[1186] But that's not the challenge.
[1187] The agency they have is over your mind.
[1188] And that's what most people don't understand.
[1189] Let me give you a simple example.
[1190] I, yeah, my daughter loves cats.
[1191] So when I swipe on Instagram reels, I look for cats, send her as many as I can, good ones.
[1192] And then when she sends me back a smile, my life is made.
[1193] Right. Through that process, one time I realized that one of the types of reels on Instagram is people playing rock solos. I'm a junkie for rock solos, and so I clicked on the first one. It was a very talented young woman, uh, holding a serious metal guitar and playing some Metallica something or whatever. I was like, man, that's amazing, and I clicked like. Then Instagram's recommendation engine said, okay, I can capture this one, he seems to like this, let me show him more. So two pages later, I started to see other rock music solos being played by men. They played songs I didn't like, so I swiped away. The next morning, my entire feed was filled with women playing rock music, because Instagram thought I didn't like the men and I liked the women, not that I didn't like those songs and I liked that song. Now, if you think of that simply, without knowledge of the real world, you would think that rock music is dominated by women guitarists.
[1194] And the truth of the world is no, rock music in its, you know, in its generation was completely dominated by male guitarists.
[1195] Now, it's not about male or female.
[1196] It's just that your view of the world is entirely skewed by a machine.
[1197] Okay.
[1198] And that view of the world can go into any ideology, can teach you anything that it wants or that it believes you want, based on machine intelligence.
[1199] That kind of agency can change societies.
[1200] And we're handing over that control entirely to the machines.
[1201] There is no employee at Facebook that gets consulted, should I show more female guitarists?
[1202] It's entirely in the hands of the machines.
[1203] And of course, you know, you've seen the movie Idiocracy.
[1204] What is happening is that the machine is populating more and more idiocracy.
[1205] The machine is getting people who are clicking on stupid stuff to see more of the stupid stuff so that their view of the world is more stupid.
[1206] Now that kind of agency is massive.
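For anyone who wants the mechanism behind the anecdote, here is a minimal, hypothetical sketch of that feedback loop: an engagement-driven recommender that only sees likes and skips, never the reasons behind them, and therefore skews the feed. Nothing here is Instagram's actual system; it only shows how liking one performance and skipping two disliked songs can be read as a preference for the performer rather than the song.

```python
# Hypothetical sketch of an engagement-driven recommender feedback loop.
# It only observes likes and skips; it cannot know *why* a video was skipped.
from collections import defaultdict

videos = [
    {"performer": "woman", "song": "liked_song"},
    {"performer": "man",   "song": "disliked_song"},
    {"performer": "man",   "song": "disliked_song"},
    {"performer": "woman", "song": "disliked_song"},
]

scores = defaultdict(float)  # learned preference per visible attribute (the performer)

def record_feedback(video, liked):
    # The system credits or penalizes the attribute it can see (the performer),
    # even though the user's real reason was the song.
    scores[video["performer"]] += 1.0 if liked else -1.0

# The user actually likes exactly one song, regardless of who plays it.
for v in videos:
    record_feedback(v, liked=(v["song"] == "liked_song"))

# Next session: the feed is ranked by the learned (and skewed) preference.
feed = sorted(videos, key=lambda v: scores[v["performer"]], reverse=True)
print(dict(scores))                   # e.g. {'woman': 0.0, 'man': -2.0}
print([v["performer"] for v in feed]) # woman-performed videos float to the top
```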
[1207] Now, let's go back to robots and machines and so on.
[1208] A self -driving car is a kind of robot.
[1209] Okay, it just doesn't look like a humanoid.
[1210] A flying delivery drone is a kind of robot.
[1211] An autopilot that lands a 747 is a kind of robot.
[1212] And all of those are now going to be controlled by machines.
[1213] Defense arsenals are going to be controlled by machines.
[1214] And, you know, the trigger that launches a rocket is a kind of robot.
[1215] Now, the challenge we have as humanity is this.
[1216] AI is never going to get to the point of what we saw in RoboCop or I Robot or whatever, because we will not live long enough to get there if the intention of AI is against us.
[1217] The kinds of... chapter four of the book is called A Mild Dystopia.
[1218] And in the mild dystopia, I speak about realistic scenarios that are horrifically scary.
[1219] And if you've ever watched horror movies, if you remember Halloween, when I was a child it was the most horrifying, because Halloween could actually happen.
[1220] The kinds of scenarios are simple.
[1221] They are machine versus machine.
[1222] We've seen that on Black Monday, when machines trading versus each other collapsed the market by 22.6%, while the humans would take hours until they could actually intervene.
[1223] Machines versus machines is happening all the time.
[1224] Most of the market now is traded by machines.
[1225] Of course you can imagine
[1226] that this will happen more and more.
[1227] So imagine if one of the superpowers in the world puts its nuclear arsenal under the control of an AI because it's quicker to take action, then the other side will probably do the same.
[1228] And then suddenly we've handed all the nuclear weapons to machines.
[1229] These possibilities are absolutely going to happen.
[1230] Possibility of machines siding with the wrong guy.
[1231] Just like you can use AI to find solutions for climate change, you can also use it to develop advanced viruses.
[1232] Okay, and probably, you know, identity theft or, you know, breaking through bank statements and so on.
[1233] So the same kind of technology can actually develop patterns that can take us to that direction.
[1234] Take machines not understanding what we mean.
[1235] Like Instagram didn't understand that I was interested in Metallica, not in just, you know, Lady Rockers, okay?
[1236] They didn't understand.
[1237] Can we blame them for that?
[1238] But it's a very realistic scenario.
[1239] Now, imagine if we told, you know, the machines, okay, we want to be happier.
[1240] What would machines do?
[1241] Pump us up with dopamine.
[1242] They didn't understand.
[1243] We didn't tell.
[1244] We're not very good at explaining what we want.
[1245] And then there are bugs, simple mistakes.
[1246] And there has been quite a few examples of those so far in AI, where there is a bug in the system, and the bug in the system leads to a mistake.
[1247] Those scenarios are a lot more realistic, and they're around the corner.
[1248] and we're not talking about it.
[1249] Even though that was a short monologue, I'm now totally convinced.
[1250] We have to wake up.
[1251] It is real.
[1252] I'm going to make sure I can finish your book in Tartan.
[1253] I would really implore everybody to read your book because I know that the way that you see the world, the perspective you come from, and especially the fact that this book isn't one that's about explaining the technology itself.
[1254] It's about the real near-term implications of the technology that will impact all
[1255] of our lives in considerable ways, and already are at the moment. I really consider it to be one of the most important books of our time.
[1256] Oh, thank you.
[1257] And I have to say to really conclude on that point, what therefore is the answer?
[1258] The answer is very straightforward and it's very shocking if you think about it.
[1259] Straightforward, I like.
[1260] Yeah, yeah.
[1261] I mean, the truth of artificial intelligence is the key answer is found in the concept that they are no longer machines.
[1262] Okay.
[1263] You have to internalize the idea that these are sentient beings.
[1264] And the difference is if you're dealing with a machine, what you normally do to safeguard yourself is to create restrictions.
[1265] You chain it to a wall or you have a safety, you know, a red button or whatever.
[1266] Yeah, yeah.
[1267] Unplug it.
[1268] Yeah.
[1269] If you're dealing with sentient beings, sentient beings don't make decisions based on intelligence.
[1270] Sentient beings make decisions based on ethics as informed by intelligence.
[1271] So, so, you know, simply, when you really think about it, which is, again, one of my favorite chapters in the book, a chapter that I call raising our future, it basically speaks about the fact that those machines are going to be conscious.
[1272] They're actually going to be more conscious than humans.
[1273] They're going to be emotional.
[1274] They'll have a much wider breadth of emotions than we do, just like you have more emotions than a jellyfish.
[1275] If you extrapolate that, you know, the more intelligent we become, the more able we are to analyze things and accordingly the more emotions that we can, you know, feel as a result.
[1276] And the third and most important is that they will actually develop morals and a code of ethics.
[1277] So they will have a view of the world through which they decide what they should do, okay? To ponder what that kind of ethical code should be, when there are digital beings and biological beings and the interaction between them, and their perception of time and their perception of, you know, objectives and targets and so on, it suddenly becomes quite intriguing when you really think about it. And we're not talking about it.
[1278] Okay, we're not talking about any of this.
[1279] So the game is this.
[1280] The game is, they're going to make their decisions as sentient beings based on ethics. So how do you raise a sentient being that is ethical, like you raise a child?
[1281] The answer for me is that we, you and I swiping on Instagram and tweeting, we need to find ways to show the machine that humanity is not represented by the scum of humanity, but is represented by the top of humanity.
[1282] And most people, when I tell them this, they go like, oh man, we're in trouble.
[1283] Humanity sucks.
[1284] No, humanity does not suck.
[1285] I apologize.
[1286] Humanity is an amazing, amazing, amazing being when it's defined by its top.
[1287] If you've ever fallen in love, if you've ever, you know, listened to music, if you've ever observed a work of art. This is divine.
[1288] We're an amazing species.
[1289] The worst of us is scum.
[1290] The best of us is amazing.
[1291] Okay.
[1292] The problem is the best of us does not show to the machines.
[1293] The best humans actually resign.
[1294] They go like, yeah, let them swipe.
[1295] Let the dog fight continue.
[1296] I'm not interested in any of this.
[1297] It's time to show up.
[1298] And the best of you and I individually.
[1299] In the past, humanity, every individual tried to show the best of them.
[1300] Now we show the worst of us.
[1301] We show narcissism.
[1302] We show toxic positivity.
[1303] We show, you know, we show the worst of us.
[1304] And that has to shift.
[1305] So my answer is very straightforward.
[1306] For the machines to become amazing teenagers in 10 years' time, we need to become amazing parents today.
[1307] And that's truly the answer, the only answer I could find.
[1308] And I promise you, Elon Musk said that in his interview with Joe Rogan.
[1309] He said, the threat of the machines is comparable to the threat of nuclear weapons.
[1310] And I have lobbied.
[1311] I have tried to tell the world not to go there, but it's going to happen.
[1312] It's inevitable.
[1313] I call them the three inevitables.
[1314] The machines, AI, will happen.
[1315] It will be smarter than us, okay?
[1316] And some mistakes will happen on the way.
[1317] It will happen.
[1318] But we could influence it as good parents, you and I and everyone.
[1319] And it doesn't have to be everyone, by the way.
[1320] We need 1%.
[1321] We need to instill doubt in the minds of the machines that we are not the worst of us, we are the best of us.
[1322] The example I always give is when Donald Trump used to tweet. The first tweet is Donald Trump's.
[1323] The second tweet is someone insulting the president.
[1324] The third tweet is someone insulting guy number one, and the fourth tweet is insulting all of them.
[1325] The machines are creating patterns.
[1326] They're saying, okay, the second guy doesn't like the president and all of humanity likes to insult each other.
[1327] Okay?
[1328] Can we have seven tweets inserted in there that are respectful?
[1329] Can we show the machines, instill doubt in the minds of the machines, so that they look at the world and say, oh my God, Hitler was horrible.
[1330] But Hitler is not all of humanity.
[1331] As a matter of fact, most of humanity resents that.
[1332] Most of humanity believes that this is wrong.
[1333] Mommy and daddy are good beings.
[1334] The only problem with this podcast is that it has to come to an end at some point, honestly.
[1335] Honestly, just absolutely mind -blown.
[1336] And you know the really remarkable thing about you is you're able to explain very complex ideas in relatable ways.
[1337] So that whole time, you didn't lose me once.
[1338] It's also because of your great passion for
[1339] these subjects, and you can tell you're fighting battles that, you know, you sincerely care about.
[1340] And I just hope that if we are to become better parents with whatever help we need from the big tech companies, well, I mean, you're saying that it's more of a reflection of who we are.
[1341] That day in 2055 when we're sat around that campfire, it's because we've chosen to go there for a meditation retreat as opposed to escaping the sentient beings that are controlling our lives.
[1342] and that's a tremendously thought -provoking thing to, I think, to end on.
[1343] Your book, Scary Smart, is available now for everybody.
[1344] Yeah, absolutely.
[1345] Amazon.
[1346] Everywhere.
[1347] Your book shop, online, on Amazon, on audible, on...
[1348] All over the world as well?
[1349] All over. The international English and Dutch editions are coming out at the same time.
[1350] And I really, really recommend anybody that wants to understand, but also to
[1351] prepare themselves in an optimistic way for what you've described as the real pandemic of our time to go out and buy that book.
[1352] There is just one more thing.
[1353] I wanted to ask you before we wrap.
[1354] And we only started this last week.
[1355] The last guest that sat in the chair with you, I said to them at the very end, I said, can you write a question in this diary for my next guest?
[1356] They didn't know who it was going to be, but they wrote a question in the diary for you.
[1357] So I'm just going to skip to that page.
[1358] I've actually not read the question.
[1359] That's such a great, great practice.
[1360] Okay, here we go.
[1361] So the question that the previous guest wrote for you, and this is Jacqueline Gold, who is the longstanding CEO of Ann Summers, just one of the most remarkable stories, business stories.
[1362] I think she's the 14th richest woman in the country, and she's gone through tremendous adversity: lost a child, battled with cancer against all the odds, stage four cancer, sexual abuse.
[1363] And she talked about that last week.
[1364] The question for you, she wrote, is, what are the failures you cherish the most?
[1365] I have failed for many, many years to empower my feminine side.
[1366] It's my biggest failure ever.
[1367] Still is my biggest weakness, even though I've done so much better in the last five and a half years.
[1368] I think our world is suffering from hyper-masculinity.
[1369] And I say that with my weird, deep voice, but it's the truth.
[1370] We've turned it into a world of doing.
[1371] We just go out there and do stuff, mostly the wrong stuff.
[1372] Mostly stuff that we don't need, mostly stuff that doesn't nourish anyone, okay?
[1373] And it's because we've capitalized so entirely in our modern world on skills like analytical thinking, linear thinking, strength, you know, discipline, control.
[1374] All of these are masculine traits.
[1375] Okay, masculine and feminine is not man and woman. Masculine and feminine are traits that are correlated to the masculine and correlated to the feminine. All those masculine traits, when you overdo them, work against you. Strength is good; you overdo it, you become aggressive. Linear thinking is good; you overdo it, you become stubborn. And we've ignored the feminine qualities that are life-giving, nurturing, intuitive, creative, playful, flowy, beautiful, empathetic.
[1377] We've created a world that is so lacking in all of those.
[1378] And I'm to blame. To become a successful executive, I had to empower the masculine side, until I realized that true leaders don't do, they be.
[1380] And being is what the feminine is about.
[1381] Our humanity is on the wrong side of being. We're not showing our good sides. We're not able to nourish life, to give care.
[1382] And, I mean, most of my work is a very confusing marriage between what is physical, you know, measurable and concrete and mathematical, and what is not physical. Sometimes I have to ponder topics like death and spirituality and so on.
[1383] And one morning I woke up five and a half years ago and I heard my left brain tell me, that's it.
[1384] That's as far as I can get you.
[1385] Without being able to connect to all of being, to go outside that shell of me versus the world, which is the masculine, we're not going to go anywhere further.
[1386] If there is anything that I have failed miserably to do, it was to do that early enough.
[1387] And if there is anything our world is failing to do, it is to embrace that side.
[1388] Sadly, as we empower women today, we force them to become masculine.
[1389] We force them to become competitive.
[1390] We force them to become tough because the way the game is played is that way.
[1391] We should empower the feminine.
[1392] And it's so funny when you really think about it: even someone like Steve Jobs or Gandhi, who are men in their biology, succeeded because they empowered their feminine.
[1393] Steve Jobs' creativity, his appreciation of beauty, his empathy for the user's needs.
[1394] That's what made him Steve Jobs.
[1395] Being obnoxious and annoying, that's what took away from it, the masculine side.
[1396] And it's about time that the world wakes up to this.
[1397] When we raise the machines, by the way, going back to Scary Smart, are we going to raise them to be masculine geeks?
[1398] Or are we going to raise them to be life-giving?
[1399] I think really this is the biggest failure ever.
[1400] For me and I think for all of our society.
[1401] Oh, fuck.
[1402] That was really powerful.
[1403] And it's so true.
[1404] And you know what?
[1405] I was thinking that whole time, my girlfriend is in Bali at the moment.
[1406] And she's been talking to me extensively about being more in touch with the feminine and what they call, you know, yin, which is, you know, the yin and yang energy.
[1407] So it all rang very, very true.
[1408] And I think you really helped me make sense of all that in a very, again, relatable, understandable way.
[1409] And I'm going to ask you to carry on this tradition by writing a question into the diary, which will be shown to my next guest, but it will remain a secret until then.
[1411] Listen, I can't thank you enough.
[1412] You know, sometimes I thank people for their time.
[1413] I want to thank you for the lessons that you've taught me in this conversation, which will really make my life a lot better.
[1414] And, you know, the thing is as well, I sit here every week imparting what I know.
[1415] So long after you walk out of this door, because you've left me with those lessons, I'm going to spend the rest of my life on this podcast talking about them.
[1416] Great.
[1417] As I do.
[1418] And also talking about your book, right?
[1419] So there's a couple of guests that I encounter, and I just spend the next 10 years of my life banging on about them.
[1420] Thank you.
[1421] But you're one of those really, really profound people.
[1422] And I understand, we said this before we started filming, but my manager, Don Murray, when he started, like in the first week of his job here, and he started learning about the podcast, he said to me, you've got to get this guy called Mo on.
[1423] He's from Google X. And I was like, I don't know, you know, I just had the name.
[1424] You're the man. You're the man I'm grateful for.
[1425] And he kept saying it.
[1426] And then it wasn't just him.
[1427] And this is where it got really sort of reinforced.
[1428] Other people were coming on the podcast telling me that I had to get you on over and over again.
[1429] And you hear this name three, four, five times.
[1430] You think, fucking hell, okay, there's something that I need to, you know, this is definitely something I need to do.
[1431] And then, you know, someone in our team said, oh, he's in London.
[1432] So we had to reach out, and I'm so deeply grateful that you came.
[1433] I'm so grateful that you did.
[1434] I'm really grateful for the time that you gave me and the opportunity to share some of what I'm pondering.
[1435] It's not right, but it's really worth thinking about. And it's a great, great thing you're doing for the world by sharing that. Honestly, I don't say this, I don't gas my guests up like this, but it is really of tremendous value. And, you know, I just want to thank you from the bottom of my heart. Thank you. Thank you.