The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day. What's up, Noland? Nothing much. You guys hear me through this? Yeah. Far away? Cool. No, it's perfect, it's perfect. It's a pleasure to meet you, man. Hey, you too. And you too, thanks for having me. I have a feeling if there's a movie that they do in the future of how the world changed in 2024, you're going to be in that movie. Yeah, that would be cool.
[1] Yeah, that'd be cool.
[2] I wonder if they'd get to play me. They probably don't need people by then.
[3] They probably just do movies with AI and probably really quickly.
[4] You could probably like take a really great novel like The Great Gatsby, run it through an AI video creator and it would just make you the most amazing version of the Great Gatsby.
[5] Yeah, that's true.
[6] Probably.
[7] Yeah, that'd be sick.
[8] But if we're talking about like historical moments in human beings and in technology, the implementation of Neuralink on the first human patient, that's you.
[9] Yeah, yeah, I guess so.
[10] No, definitely.
[11] Yeah.
[12] Yeah, I mean, I was, I keep thinking about it like, you know, BCI have been around for a while.
[13] So I've told people.
[14] What is BCI?
[15] Brain computer interface.
[16] So like just implants that they've done in people, different ways that they've found, they've given people the ability to like control electronic devices.
[17] They've been able to control computers and stuff.
[18] There are a couple things out there.
[19] The Utah Array. Synchron came out with something where basically they go through the artery in the neck and they kind of thread something up into the brain.
[20] It expands in a vein up there and an artery up there.
[21] And then they can like control the brain through that.
[22] So BCI's been around for a while, a few decades at least, I think since like the 90s.
[23] So I always say that we're standing on the shoulders of giants sort of thing, but I know Neuralink is just, it's in a league of its own.
[24] And I know that, you know, with Elon's name attached to it, it's going to blow up way more.
[25] But I think this is the beginning.
[26] I think everyone else that comes after this, basically, is going to be pulled up by the progress Neuralink's making.
[27] And the fact that they are trying to open source basically all of it, I think the whole field is just going to grow exponentially at this point.
[28] Well, we can only hope so.
[29] And that really is fascinating.
[30] And it really is fascinating how many different ways and strategies they've employed to try to connect computers to human beings and brains.
[31] So do you know what year the first one was that they did this?
[32] 98.
[33] Oh, wow.
[34] Yeah, I think so.
[35] I think that was the Utah Array.
[36] That just was, it looks like a chip with like more fixed like threads on it.
[37] They were, I think, a lot smaller.
[38] And it just sat on the brain.
[39] So obviously another open brain surgery and they put it in there.
[40] And then it would read a section of the brain motor cortex, I think, as well.
[41] Have you seen some of the stuff now where they're using some kind of scanning imagery where they can actually see thoughts?
[42] No, I haven't.
[43] Yeah, they're doing where they think they're going to be able to record dreams eventually.
[44] And what they're able to do now is get like an approximation of what someone is seeing and thinking.
[45] Whoa.
[46] Can you find that, Jamie, so we could figure out exactly what they did?
[47] Yeah.
[48] Here it is.
[49] Scientists read dreams using brain scans.
[50] Is that an older one?
[51] This is not the newer one.
[52] Okay.
[53] That's crazy.
[54] I mean, I've always heard that scientists really don't know how, like, what dreams are and, like, what is going on or why we do it.
[55] I've heard plenty of people say, like, yeah, we still don't know why you even need to sleep or, like, what's going on in a dream?
[56] I don't know if that's changed recently, but...
[57] Like, I don't know, dreams are, dreams are an interesting thing.
[58] The whole sleep thing is interesting.
[59] Yeah.
[60] MRI scans reveal what we see in dreams.
[61] Japanese researchers unveil dream visuals with 60% accuracy.
[62] What?
[63] Using innovative MRI scans in pivotal Kyoto studies, showcasing a breakthrough in sleep science.
[64] Whoa.
[65] Wow.
[66] Wild stuff.
[67] That picture looks just like AI.
[68] Are we dreaming an AI now?
[69] I think we're close.
[70] Yeah.
[71] I think if the simulation is real, it seems ridiculous now, less so than it seemed five years ago, but I think five years from now it'll seem likely.
[72] I think it's all interconnected in some very bizarre way.
[73] I think we were slowly building toward that connection with all of this technology and all of these new innovations and all of a greater understanding of quantum physics and space and all these.
[74] As they build on all this stuff, I think it's going to become more and more likely that this whole thing somehow or another is real but not real at the same time.
[75] Yeah.
[76] Neither a simulation nor like actual reality, like a hybrid of these things.
[77] Oh, yeah.
[78] That'd be crazy.
[79] That's one of the things I'm really excited about with Neuralink is how much we're going to learn just about the brain from this.
[80] Like the amount of data that they're collecting.
[81] I mean, little things like...
[82] The fact that all this stuff with the thread pullout going on with my brain, one of the reasons that they think it happened is because, well, I don't know.
[83] Have you heard about, like, the thread pullout and stuff?
[84] Yes.
[85] So basically, there are 64 threads implanted in my brain with 16 electrodes on them each.
[86] And over the...
[87] course of a month, we saw a lot of the threads start retracting from my brain.
[88] So the threads that the robot implanted were retracted.
[89] And so we were getting less signals from a lot of them.
[90] And they can't see that on like brain scans or anything.
[91] So like the threads are so small, you know, not even the size of a human hair, that in order to get a scan of them, you'd have to use such a big machine that it would probably just fry my brain.
[92] So they can't just go in and look at them.
[93] So a lot of the data that we have that shows that they were moving or coming out of the brain was literally just whether or not the electrodes on the threads were sending signals anymore if they were picking up neuron spikes.
[94] So a lot of the threads were getting pulled out and that led to some decline in performance for a while.
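For illustration only, here is a minimal sketch of the kind of check described above: flagging threads as likely retracted when their electrodes stop picking up neuron spikes. The channel counts match what's stated in the conversation (64 threads, 16 electrodes each); the threshold, data, and function names are assumptions, not Neuralink's actual telemetry.

```python
import numpy as np

# Hypothetical illustration: flag likely-retracted threads by the absence of
# detected neuron spikes on their electrodes. Channel layout (64 threads x 16
# electrodes) matches the interview; thresholds and data are made up.

N_THREADS, ELECTRODES_PER_THREAD = 64, 16
MIN_SPIKES_PER_MIN = 5            # assumed floor for a "live" electrode

def flag_retracted_threads(spike_counts_per_min: np.ndarray) -> list[int]:
    """spike_counts_per_min: (64, 16) array of spikes/minute per electrode.
    Returns indices of threads whose electrodes have all gone quiet."""
    live = spike_counts_per_min >= MIN_SPIKES_PER_MIN        # (64, 16) bool
    return [t for t in range(N_THREADS) if not live[t].any()]

# Example with synthetic data: most threads active, a few silent.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=40, size=(N_THREADS, ELECTRODES_PER_THREAD))
counts[[3, 17, 42]] = 0           # simulate three fully retracted threads
print(flag_retracted_threads(counts))   # -> [3, 17, 42]
```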
[95] They kind of fixed that in a way.
[96] But some of the reason that that happened, at least we think, is because the brain moves more than they thought it would, which is something that was so bizarre to me when I first heard that.
[97] I was like, you guys don't know how much the brain moves?
[98] Like this feels like, like that should have been something that was solved, you know, ages ago.
[99] I never even thought it moved.
[100] Yeah, so it pulses, like with, with your brain, with your heart, I mean.
[101] So, like, as your heart pulses and stuff, your brain pulses as well.
[102] Because, you know, there's blood running through it and everything, so it's just pulsing.
[103] And they thought that it...
[104] moved, like pulsed, at about a one millimeter rate.
[105] So that's how much it'll pulse, like move, is one millimeter.
[106] And they found in my brain that it was moving three millimeters pulsing.
[107] So that's three X what they had made the whole Neuralink and the threads and everything to be able to withstand.
[108] So they think that that might have had something to do with it as well.
[109] So is that a normal thing that like does the brain have a range?
[110] Yeah, I don't know.
[111] I think that's going to, we'll know more.
[112] Stiffness pulsation of the human brain detected by non-invasive time.
[113] The human brain pulses every time the heartbeats.
[114] Scientists have used the tiny jiggle to reveal new insights about our neurons.
[115] Neuroscientist, try that name.
[116] Ueli Rutishauser, Ueli Rutishauser.
[117] Ph.D., thought he'd uncovered a strange new phenomenon about the human brain.
[118] So it pulses every heartbeat.
[119] So if your heart beats a lot, if your heart's beating fast, if you're jacked up, does your brain pulse fast too?
[120] Yeah, I'm sure.
[121] I mean, I get like...
[122] What happens with me is if my heart rate is higher, I'll get, like, headaches and stuff.
[123] So, like, I have a lot of weird things with my, with my body, with being a quadriplegic where, like, I can tell, like, if I have really high blood pressure, my head just gets, like, really, really, like, I get really bad headaches and stuff.
[124] But, yeah, so...
[125] brain moves more than we thought it did, which blew my mind. Once we get more people in the study, then we'll really know if for some reason my brain just moves a lot more than it should. I imagine that we'll see something around the same, and then we will be able to determine, like, a range like you're talking about, if it's, you know, a range of one millimeter to, say, five millimeters, or if it's pretty consistent around three millimeters. I'm not sure. So what this implant allows you to do is you can interface with a computer and you can use a keyboard, you can type in URLs, you can play video games.
[126] Yeah.
[127] How does it work?
[128] Yeah.
[129] So basically, excuse me, my...
[130] Implant has like a Bluetooth connection to the computer.
[131] And then through that, Neuralink has created an app that they have uploaded to the computer.
[132] And through that app, I can interface with the computer.
[133] What it does is all of the electrodes on the threads are sending neuron spikes, neuron signals.
[134] And...
[135] Through my, so it's all implanted in my motor cortex, through my intentions.
[136] So say if I want to try to, you know, move my hand left, right, up, down, I can't really move it.
[137] I have like a little bit of movement in my hand, but I can't really move it.
[138] But the neurons are still firing.
[139] That intention is still there.
[140] So like those signals are being sent.
[141] There's just a cutoff in my spinal cord.
[142] So obviously it's not getting down.
[143] But it's still going on in my brain.
[144] And those electrodes are picking up those signals.
[145] And there's an algorithm like machine learning going on in the background that is taking those intentions.
[146] And over time, it is learning what I'm trying to do.
[147] And that translates to cursor control.
[148] Oh, shit.
[149] Yeah.
[150] So if I want to try to move the cursor to the left, I move my hand to the left.
[151] But that's not necessarily what I would need to do.
[152] If I wanted to move the cursor to the left, I could kick my foot or I could do any sort of like motor action to train it to learn that's what I want it to do to go left.
[153] So there will be like a visual on the screen that says like move your hand to the left and then they will train that left movement to left on the cursor control.
[154] But that visual could be anything.
[155] It could be like do a little jig and that'll move it to the left.
[156] Like anything that it can do, anything you can do, I mean it can learn and you can map that to anything.
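As a rough illustration of the calibration loop described here, the sketch below pairs binned spike counts with the cued cursor velocity during a training task and fits a linear decoder. Neuralink's real decoder is not public; the shapes, bin size, and least-squares fit are assumptions chosen only to show the "intentions in, cursor velocity out" idea.

```python
import numpy as np

# Hypothetical sketch of decoder calibration: during cued movements
# ("move your hand to the left"), binned spike counts are paired with the
# intended cursor velocity, and a linear decoder is fit by least squares.

N_CHANNELS = 1024                     # 64 threads x 16 electrodes
rng = np.random.default_rng(1)

# Synthetic calibration data: each row is one 50 ms bin of spike counts,
# each target is the cued (vx, vy) cursor velocity for that bin.
X = rng.poisson(lam=2.0, size=(5000, N_CHANNELS)).astype(float)
true_W = rng.normal(scale=0.01, size=(N_CHANNELS, 2))
Y = X @ true_W + rng.normal(scale=0.05, size=(5000, 2))

# Fit decoder weights W so that X @ W approximates the intended velocity.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def decode_velocity(spike_bin: np.ndarray) -> np.ndarray:
    """Map one bin of spike counts (shape (1024,)) to a cursor velocity (vx, vy)."""
    return spike_bin @ W

print(decode_velocity(X[0]), "vs cued", Y[0])
```

Because the decoder only cares about which neural pattern is paired with which cued target, the training cue really can be anything, a hand movement, a foot kick, a "little jig," exactly as described above.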
[157] Yeah.
[158] So does this include facial movements?
[159] Yeah, yeah.
[160] So you could, like, move it with your nose.
[161] Yeah, I'm pretty sure.
[162] Like, we haven't tried anything like that.
[163] We haven't tried, you know, a lot of stuff.
[164] This is very, very, like, it's still very new.
[165] So there are things that, you know, we've...
[166] We're working on what works well at this point.
[167] So like a lot of it is like my right hand stuff.
[168] We have mapped a lot of things to like individual fingers, hand movements in general.
[169] But we've done like left hand stuff.
[170] We've done like foot kick stuff.
[171] And it doesn't look like the signals are as good.
[172] But that also might be just due to the fact that some of the threads are pulled out.
[173] So when they fix that issue with the next people, then those things would be much, much better.
[174] And if that's the case, then you could theoretically do multiple things at once.
[175] It's not just, you know, you map, say, my right hand to the cursor control, then you map my fingers and my other hand and my toes to, like, key control.
[176] So I could be moving the cursor and typing at the same time with my toes or something.
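A small sketch of the multi-mapping idea just described: once separate movements can be decoded reliably, each one can be bound to its own output, so cursor control and key presses could run at the same time. Every class name and action below is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical mapping of decoded movement classes to separate outputs.
# The class names and actions are made up; they only illustrate that any
# reliably decodable action can drive any control, simultaneously.

@dataclass
class ControlBinding:
    description: str
    action: Callable[[float], None]   # takes a decoded intensity/direction

def move_cursor_x(v: float) -> None:
    print(f"cursor dx={v:+.2f}")

def press_key(v: float) -> None:
    if v > 0.5:
        print("key press")

BINDINGS = {
    "right_hand_left_right": ControlBinding("cursor horizontal", move_cursor_x),
    "left_index_flex":       ControlBinding("mouse click / key", press_key),
    "right_foot_kick":       ControlBinding("scroll or secondary key", press_key),
}

def dispatch(decoded: dict[str, float]) -> None:
    """decoded: {movement_class: decoded value} for one time bin."""
    for cls, value in decoded.items():
        if cls in BINDINGS:
            BINDINGS[cls].action(value)

dispatch({"right_hand_left_right": -0.8, "left_index_flex": 0.9})
```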
[177] Wow.
[178] Yeah, there's a lot, a lot to explore with this.
[179] That's so interesting that it's tied to your mind telling different parts of your body to move.
[180] I'm very, obviously, very ignorant to this stuff.
[181] I thought like you were just using your mind and telling the cursor to go around.
[182] Yeah, so...
[183] It's something, that is true.
[184] So it's something that we differentiate.
[185] There are what are called attempted movements and imagined movements.
[186] So at the very beginning, I did a lot of attempted movement.
[187] Attempted movement is just what it sounds.
[188] Like, I attempt to move my hand in a certain direction.
[189] I attempt to move my fingers, like lift your finger up, down, left, right.
[190] I attempt to do something.
[191] And then the algorithm will take that and translate it to cursor control.
[192] But what I realized maybe a few weeks in was that I could just think Cursor go here and it would move.
[193] That, it blew my mind when that happened for the first time.
[194] Like, like I said, with everything going on in my brain, all of it still works.
[195] All the signals are still there.
[196] Like, I think something to try to move and the signal gets sent.
[197] So when I'm attempting to move my hand and the cursor's moving, and it's moving basically where I want it to, I'm like, yeah, that makes sense.
[198] It didn't really shock me that it worked.
[199] I assumed that it would work because all the signals are still working.
[200] It's just my spinal cord that's jacked up.
[201] Um, but when I moved it for the first time with my mind without attempting to move at all, it, like I was giddy the entire day.
[202] I could not believe what had just happened.
[203] And I think we're going to find that with a lot of things.
[204] Um, right now we are doing like, I'm trying to map like sign language, like the sign language alphabet in order to text, like write words and stuff.
[205] And it's pretty promising.
[206] It worked.
[207] I'm sure there's a video out there of me somewhere that Neuralink has of me spelling a couple words with sign language.
[208] Wow.
[209] So you're thinking in your mind or you're trying to get your hands to make the signs of sign language and then the computer interprets that as the language and types it out.
[210] Yep.
[211] And I think the same thing is going to happen where I went from attempting to move my hand to imagining just moving the cursor.
[212] I think it's going to be the same way with the texting.
[213] I haven't had this confirmed yet, but I don't see why not.
[214] I think at some point the computer is going to learn, like me trying to do certain letters, if, like, attempting it.
[215] At some point, I'm just going to think that letter instead of actually trying to move, and it'll type it.
[216] Wow.
[217] Because I think it's both like me learning what the computer is trying to do, the algorithm, and the algorithm learning what I'm trying to do.
[218] And so over time, it's just going to be completely thought -based.
[219] I don't see why it wouldn't get there.
[220] From what I've seen just with the cursor control, it makes sense that, you know, as I'm attempting, it's learning.
[221] And then instead of even needing to attempt, it'll just understand what I want to do and it'll do it.
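To make the letter-spelling idea concrete, here is a hedged sketch that treats each attempted sign-language letter as a class and trains a standard classifier from spike features to letters. The data is synthetic and the feature sizes are assumptions; it is not Neuralink's text decoder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical letter decoder: each attempted sign-language letter is a class,
# and a classifier maps binned spike features to letters. Data is synthetic.

N_FEATURES = 1024                       # one feature per electrode per bin
LETTERS = [chr(c) for c in range(ord("a"), ord("z") + 1)]
rng = np.random.default_rng(2)

# Synthetic training set: each letter gets its own mean firing pattern.
prototypes = rng.normal(size=(len(LETTERS), N_FEATURES))
labels = rng.integers(0, len(LETTERS), size=3000)
X = prototypes[labels] + rng.normal(scale=1.5, size=(3000, N_FEATURES))
y = np.array(LETTERS)[labels]

clf = LogisticRegression(max_iter=500).fit(X, y)

def decode_letter(spike_features: np.ndarray) -> str:
    """Map one bin of spike features to the most likely attempted letter."""
    return clf.predict(spike_features.reshape(1, -1))[0]

print(decode_letter(prototypes[LETTERS.index("h")]))   # likely 'h'
```

The co-adaptation point above maps onto this kind of setup naturally: the user shifts from attempting the sign to merely thinking the letter, and the classifier keeps being refit on whatever neural pattern now accompanies each label.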
[222] So you were saying that you were one of the first people to do this and there's going to be more people in the trial and that maybe they'll learn the things that are going wrong with yours.
[223] Can they do yours again?
[224] Can they redo it?
[225] Yeah, they could.
[226] It was something that, you know, when the thread retraction had happened, I was obviously pretty broken up about it.
[227] I thought that, so like when they told me, I didn't have very good control of the cursor anymore.
[228] It was really hard for me to get the cursor to go where I wanted it to go.
[229] I thought my time in the trial was coming to an end.
[230] And that's really hard.
[231] It's something hard to come to terms with because they had just shown me this whole new world, like all these new capabilities that I had.
[232] And they had introduced so many things.
[233] Like before that point, I had played video games for, you know, 10 hours without needing any sort of help.
[234] And it was hard to, you know, internalize that it could all be coming to an end.
[235] I know that it will at some point because I'll be out of the study and I won't be able to use it anymore.
[236] So my first thought was, can you guys go in and fix it?
[237] Like go in, take it out, put in a new one.
[238] And they basically said, we're not at that point yet.
[239] We're going to see if we can fix it.
[240] We're going to see if we can do things on the software side to fix it, which they ended up doing.
[241] It works better than it did before now, even with like fewer threads.
[242] So I'm glad we didn't because they learned a lot.
[243] If we would have just gone in and taken it out and put in a new one, they wouldn't have learned the last, like anything that they had learned over the last three months.
[244] They could go in and do it.
[245] They're not going to.
[246] I don't think that they need to.
[247] But at some point, I know that the whole point of Neuralink is to be upgradable.
[248] So at some point, they're going to go in, hopefully, and take it out and give me a better one.
[249] Wow.
[250] Now, what is the extent of your injury?
[251] Sorry.
[252] I dislocated my C4-C5.
[253] And people keep calling it a diving accident.
[254] It wasn't really a diving accident.
[255] It was just sort of like a freak accident while I was swimming in the lake.
[256] So I dislocated my C4-C5, which they told me was good because I didn't sever my spinal cord.
[257] It was just kind of like my spinal cord, like, bounced out of place for, you know, a split second and hopped right back where it was supposed to be.
[258] And so I cannot move anything.
[259] I have no control or sensation below my shoulders.
[260] I got a little bit back.
[261] Like I can move my hand a little bit, but not enough to do anything.
[262] Like I couldn't control a joystick or anything.
[263] So yeah, no movement or sensation below my shoulders.
[264] Is there anything that, have you looked into what they do with stem cells?
[265] Yeah.
[266] Yeah, I'm, so I applied for studies before Neuralink, and I never got asked to be in any of them.
[267] I never even heard back from anyone, which is kind of what I assumed would happen with Neuralink, honestly.
[268] But I had applied for things because I obviously don't want to be paralyzed anymore.
[269] I don't want to be a quadriplegic.
[270] So it would be great if I could get into something and have them fix as much of me as possible.
[271] I mean, even if I had more control over my hands, the amount of things that I could do would like skyrocket, like an order of magnitude, better.
[272] And my life would be better.
[273] My independence would be better, everything.
[274] Yeah, don't.
[275] I mean, I don't think it would hurt to try.
[276] Are you familiar with a lot of these clinics, like the Cellular Performance Institute in Mexico?
[277] No. They do a lot of UFC fighters.
[278] They do, like, you can do things in other countries that you're not allowed to do in America because of, you know, regulations.
[279] But what they're able to do down there is they're going right into discs and they're alleviating people's disc problems where they're actually making the discs grow larger and heal people with back injuries.
[280] And I know I've read things about spinal cord injuries and improvements, but I would love to connect you with them.
[281] And they, you know, they're the experts on this.
[282] They'd be able to tell you like what the state of the art in terms of like what the research shows that stem cells can and can't do.
[283] Yeah.
[284] I don't think you could hurt.
[285] It's a healing thing, right?
[286] If you're getting some sensation a little bit better movement, maybe they could accelerate that.
[287] Yeah, that would be great.
[288] I'll connect you with them.
[289] Cool.
[290] I don't know if I'm allowed to at this point.
[291] Oh, really?
[292] Because I'm in the Neuralink study.
[293] I'm not sure that...
[294] Maybe you should lie.
[295] Yeah.
[296] I mean, it would be great.
[297] You shouldn't lie.
[298] But that would suck to get out of the study, too.
[299] Both things would suck.
[300] Yeah, yeah.
[301] I mean.
[302] Maybe they would allow it.
[303] Yeah, I mean, we'll see.
[304] We'll see.
[305] I mean, it's only something that would help you heal.
[306] Yeah, yeah, I know.
[307] I just know that like in a lot of studies, something like that, even they might not want to take on like the added risk.
[308] Understandably, also it would kind of mess up their control.
[309] Exactly.
[310] Like what happens, you juice somebody up with stem cells and does the brain pulsate more?
[311] Do the fibers come out more?
[312] Yeah, how does it interplay with any sort of device that someone has implanted?
[313] Yeah, I get it.
[314] I get it.
[315] Is there a hope in the future of utilizing this technology to help people regain movement?
[316] Yeah, yeah, that's one of the plans.
[317] I don't know if you've seen anything on it.
[318] Basically, they do something similar to what the stem cell, a lot of the stem cell research is.
[319] A lot of the stem cell stuff is, you know, implant stem cells above and below the level of injury, and those stem cells will migrate basically and create a bridge.
[320] Some of them have even talked about injecting right into the level of injury.
[321] So with the neuralink, the plan is to implant one in the brain and then implant one below the level of injury, and then the neuralinks will just talk right to each other.
[322] All the brain signals that it's picking up in the brain, wherever it's implanted, motor cortex in this scenario, would go straight to the other one, and it would send it right through your body like it should.
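Here is a minimal, purely illustrative sketch of that "bridge": the cortical side decodes intended activation from spikes, and the spinal side turns it into stimulation levels. The muscle groups, channel counts, and calibration are all made up for the example; nothing here reflects Neuralink's actual protocol.

```python
import numpy as np

# Hypothetical brain-to-spine bridge: cortical implant decodes intended
# movement, spinal implant converts it into stimulation commands.

N_CHANNELS = 1024
MUSCLE_GROUPS = ["hip_flexor", "knee_extensor", "ankle_flexor"]
rng = np.random.default_rng(3)
W = rng.normal(scale=0.01, size=(N_CHANNELS, len(MUSCLE_GROUPS)))  # pretend calibrated

def decode_intent(spike_bin: np.ndarray) -> np.ndarray:
    """Cortical side: spike counts -> intended activation per muscle group."""
    return np.clip(spike_bin @ W, 0.0, 1.0)

def stimulate(activations: np.ndarray) -> None:
    """Spinal side: turn activations into stimulation amplitudes (stub)."""
    for name, a in zip(MUSCLE_GROUPS, activations):
        print(f"stim {name}: {a:.2f} of max amplitude")

# One cycle of the bridge: read a bin of spikes, decode, stimulate.
spike_bin = rng.poisson(lam=2.0, size=N_CHANNELS).astype(float)
stimulate(decode_intent(spike_bin))
```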
[323] And are they, do they have a plan on when to try this?
[324] They're already trying it in animals.
[325] They have one in a pig.
[326] You can watch the video of it where basically they have an implant in the pig's brain and an implant in the pig's spinal cord, I think in the thoracic section of the spinal cord.
[327] and they have been moving the pig's, like, legs on its own.
[328] The pig's not paralyzed or anything, but basically they'll, like, tell the pig, come to this section of, you know, they, like, grid off the floor, and they put food in a section of the grid, and they're, like, if you're okay with us testing on you, pig, come over here, basically, and the pig will go in there, and then they will take control of the pig's leg, and they will, like, start playing around with it, like, making the pig, yeah, so this right here.
[329] So all those movements right there, the pig's leg, are them.
[330] They're doing it.
[331] So, and this is just the beginning, obviously.
[332] So this is flexor movement it's saying, and the pig is lifting its leg up unconsciously.
[333] It's not doing it on its own.
[334] Nope, they're doing it all.
[335] Dude, how long before they can hijack people?
[336] How long before the CIA can hack into you?
[337] Yeah, I know, right?
[338] That's like, I mean, that is the ultimate fear of human beings becoming cyborgs, is that we're going to be subject to all the problems that our computers and our phones have with malware and spyware.
[339] Yeah.
[340] I mean, people ask me all the time if this thing can be hacked.
[341] And short answer is yes.
[342] But at this point, at least, hacking this wouldn't really do much.
[343] You might be able to see like some of the brain signals.
[344] You might be able to see some of the data that Neuralink's collecting.
[345] And then you might be able to control my cursor on my screen and make me look at weird stuff.
[346] But that's about it.
[347] I guess you could go in and, like, look through my, like, messages, email, something like that.
[348] But I'd also have to be, like, connected already.
[349] So if I'm not connected to my computer or anything, you can't get in there on your own.
[350] So it would have to be a time when I am on it and you are able to hack it.
[351] You're giving it basically a guidebook on how to ruin your life.
[352] It's going to crank up the volume and put gay porn on full blast.
[353] Meatspin.com.
[354] Yeah, I mean, it...
[355] It is what it is.
[356] It is what it is.
[357] Yeah.
[358] I think if it happens, it happens.
[359] It's something that they had to tell me about before I got into the study.
[360] This is possible, but I'm not worried about it.
[361] What kind of a piece of shit would they be to hack your brain?
[362] Get the fuck out of here.
[363] Yeah, I know right.
[364] There's plenty of bankers out there stealing money.
[365] Go concentrate on them.
[366] You know, along that line, it's something I've thought a lot about with like doing interviews and stuff is like some of the people that I've done interviews with.
[367] I'm like, are they going to try to attack me to get to like Elon Musk or something?
[368] Are they going to say things about me or like, you know, try to do like a gotcha on me, gotcha sort of thing?
[369] Yeah.
[370] And everyone that I've talked to about that, they were just like, they would have to be the scum of the earth to try to do that to you.
[371] But we'll see.
[372] Hasn't happened yet.
[373] Maybe someone will.
[374] Oh, there's some scummy people out there.
[375] They'll give it a go.
[376] Yeah.
[377] Especially if they think it can go viral.
[378] Yeah.
[379] Yeah.
[380] It's become in fashion to criticize Elon Musk.
[381] Yeah.
[382] I've already had some people who just the way that they're interviewing me is just so, I don't know, it gives me the heebie-jeebies.
[383] Like I can tell they're trying to get me to say things.
[384] I'm just like, no. No. So what do you think they're trying to do?
[385] Are they just trying to intact?
[386] Well, see, here's a thing about interviewing that's kind of that a lot of people don't know.
[387] When you're used to talking to people, like I talk to a lot of people.
[388] I'm used to talk to people if I just meet them.
[389] This is me. If I was at a store buying food, this is me everywhere.
[390] I can be me. But it's because I'm used to it.
[391] But a lot of people when they sit down...
[392] They know they're going to be on camera.
[393] They've never been on camera before.
[394] And they get very nervous.
[395] And that's why I like to talk to people before the show, just kind of hang out a little, get you chilled out.
[396] I'm just a person.
[397] You're just a person.
[398] We're going to just talk.
[399] It's going to be easy, man. I'm your friend.
[400] We're going to have a good time.
[401] Some people don't want to do that.
[402] They want to do the opposite.
[403] So they want to sit there with a clipboard and they want to like look at you in a condescending way.
[404] And it's like a little bit of a power move.
[405] Yeah.
[406] And what they're trying to do is make salacious content.
[407] That's all they're trying to do.
[408] That's their job.
[409] Their job is different than a person who just wants to have a conversation and ask questions, which is my job.
[410] Their job is to make something dramatic happen that's going to be shared on TikTok.
[411] Yeah.
[412] Yeah.
[413] You know, they're barely in the news business anymore.
[414] What they're kind of in is the clip business.
[415] Yeah.
[416] Viral clip business.
[417] They're just, they're farming viral clips.
[418] So if they can say something ridiculous and maybe you'll say something back and that'll become the gotcha.
[419] Oh, he claps back.
[420] Yeah.
[421] Yeah.
[422] It's something like I'm not nervous talking to people.
[423] I've never have been.
[424] I've never had stage fright.
[425] I think people are people.
[426] I think I'm pretty good with people.
[427] I am not weird about interacting with others.
[428] I think it's because of my mom.
[429] My mom's like the friendliest person in the world.
[430] So I grew up just being able to walk up to someone on the street and start a conversation if I wanted to.
[431] Um, and so then I can obviously tell things when people are interviewing me, like what they're trying to get from me. Right.
[432] Um, like just the way that they ask questions, the tone of their voice.
[433] Yeah.
[434] Like, hey, I'm your friend.
[435] Like, open up to me. And it's like, it's like, it's a tear.
[436] I know.
[437] I know.
[438] Uh, it's just not, it's not great.
[439] Well, you know, that's the business they're in, you know?
[440] Yeah.
[441] If you work for a tire store, you're trying to sell tires.
[442] That's their business.
[443] You know, you need new tires.
[444] Do I really?
[445] You know, their business is talking shit and making things, you know, it's just, it's a bad format.
[446] Most of those media interviews are bad formats because it's a very limited amount of time and you have to have a clip that fits in between commercials.
[447] And also they're not free.
[448] They have executives and there's too many people that get in there and just the person talking to you should just be talking to you and they should have an understanding of what you do and how it happened and what this is all about.
[449] What is what this means for future people.
[450] You know, it shouldn't be like going after Elon Musk.
[451] Everyone's so goddamn political right now.
[452] It's so weird.
[453] Even making apolitical people political.
[454] It's just, so to connect you to that, it's just so stupid.
[455] What you are is, like I said, I think if there's a movie about the future, one of the very first people that has used...
[456] this kind of technology, and we're learning that these people are getting better at it, and they're, and now with the use of AI, I mean, who knows what's going to be possible with you just in a few years. Yeah, it's very exciting. It is very exciting. I know a lot of people are really nervous about it, and understandably so. I'm one of them. I'm nervous. Yeah, I've heard, I've heard a little bit of what you've said about it, and, like, I don't have, like, good arguments against it.
[457] I can come on here and be like, Joe, don't worry, man. Like, I'm here to help.
[458] Don't worry about it.
[459] I would say that's the computer in your brain talking to you, man. Let me let me into your computer, your phone.
[460] I'll show you there's no big deal.
[461] I'm your friend, Joe.
[462] No, but I get it.
[463] At the same time, the way I look at it is like how much it's going to be able to help people.
[464] How much it's going to be able to help people like me at the beginning, at least.
[465] Like, I know a lot of this is, like, down the road stuff.
[466] Like, you know, what it's going to do to normal people who get this.
[467] They're going to be able to be hacked or controlled or something.
[468] But for me, I think about it, like...
[469] how many people who are paralyzed don't have to be paralyzed anymore.
[470] How many people with disabilities, ALS, or Alzheimer's or any of these who are blind, how many people are going to be able to live their lives again?
[471] And that's my goal at the beginning.
[472] I know that I feel like people are going to look at me and say like I really need to be more concerned about a lot of the like things coming down the road.
[473] And it's something that I'm trying to think more about because at some point people are going to ask and I don't have good answers for it because all I'm thinking about is, you know, like I want to help people and I feel like this is going to help people.
[474] And that's what I'm focused on.
[475] Well, I think your perspective is probably the right one because no one knows what's coming.
[476] Yeah.
[477] No one.
[478] And you can be freaked out about it like I am.
[479] But I'm sometimes freaked out about it, but other times I'm just sort of resigned to the fact that this is just the existence that we find ourselves in.
[480] This is our timeline.
[481] We live in a very strange timeline.
[482] And it's happening at a very, very, very rapid rate.
[483] And no one has a map of the future.
[484] It's not possible.
[485] It's just all guess.
[486] It's completely...
[487] It is like an ant trying to figure out how to operate an iPhone.
[488] It's not...
[489] We don't have it.
[490] Whatever it is, whatever it's going to be, it's going to be, and you're not going to stop it now.
[491] It's...
[492] We are a runaway train.
[493] Yeah.
[494] Let's just hope we're going to a cool spot.
[495] Yeah, right.
[496] I mean, you look at a hundred years ago, like, there's no way they could have imagined what our world would be like now.
[497] No. And I have a feeling the next five to ten years is going to be a lot bigger than that.
[498] Yeah, I mean, exponential growth.
[499] Yeah.
[500] So.
[501] Well, it's just once this stuff goes live, it's just, it's going to be really weird.
[502] It's going to be really weird.
[503] But along the way, we're going to solve a lot of the problems that, I mean, look, if I have, I've had three knee surgeries, two ACL reconstructions.
[504] If I lived 100 years ago, I'd be a cripple.
[505] Yeah.
[506] You know, just how it is.
[507] My knees would be destroyed.
[508] I wouldn't be able to walk good.
[509] And now I can do anything.
[510] That's just medical technology and understanding of the human body.
[511] Implementation of this kind of device that can allow you to move your body.
[512] And can, as you were saying earlier, you can bring back eyesight to some people.
[513] This is something that they really are hopeful for.
[514] Have they done any of that on animals yet?
[515] I'm not sure.
[516] I know that what the plan is.
[517] Like they did a talk about it a while ago on like a show and tell.
[518] They basically show how...
[519] Like, how the neuralink works in my brain would be very, very similar.
[520] You would just take, you would, you would like activate certain parts of the brain or behind the eye, the part of the brain, the part of the eye that dictate sight and stuff.
[521] You would activate certain things in order to display what's going on around the world to someone, to the back of someone's eye, to their retina, whatever it is.
[522] I don't know much about it.
[523] But they have done it.
[524] Oh, they did it with monkeys, actually.
[525] Yeah.
[526] So there's a video of them lighting up parts of a screen.
[527] And they have like basically an eye tracker in the monkey.
[528] And so the monkey will look to different parts of the screen and like wherever they've lit up on the brain basically.
[529] So whatever is going, whatever implant they have in the brain, they'll like light up somewhere on the brain.
[530] And then they'll light it up on the screen and the monkey will look there.
[531] And then at some point, they stop lighting it up on the screen and they're just lighting it up in the monkey's brain and the monkey still looks there.
[532] So, yeah, so they know that they can do these sorts of things.
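A rough sketch of the vision idea discussed above: downsample a camera frame to a coarse electrode grid and use each cell's brightness as a stimulation level, which gives the kind of low-resolution pattern described a bit later in the tweet Joe reads. The grid size and mapping are assumptions, not Blindsight's actual design.

```python
import numpy as np

# Hypothetical visual prosthesis mapping: a grayscale frame is average-pooled
# down to an assumed 32x32 electrode grid, and each cell's brightness sets one
# electrode's stimulation level (a coarse pattern of phosphenes).

GRID = (32, 32)   # assumed electrode grid

def image_to_stim_pattern(image: np.ndarray) -> np.ndarray:
    """image: 2D grayscale array in [0, 1]. Returns a (32, 32) stimulation map."""
    h, w = image.shape
    gh, gw = GRID
    # Trim so the image divides evenly, then average-pool into grid blocks.
    trimmed = image[: h - h % gh, : w - w % gw]
    blocks = trimmed.reshape(gh, trimmed.shape[0] // gh, gw, trimmed.shape[1] // gw)
    return blocks.mean(axis=(1, 3))

# Example: a bright vertical bar becomes a column of strongly driven electrodes.
frame = np.zeros((240, 320))
frame[:, 150:170] = 1.0
pattern = image_to_stim_pattern(frame)
print(pattern.shape, pattern.max(), pattern[:, 15].mean())
```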
[533] Yeah, it's amazing.
[534] I know that there are other companies that have done something similar to this too, like giving people, like helping people with their eyesight.
[535] I know one of them like went under, which was...
[536] It was just a wild story, basically about a company who had implanted things in people and the company went under, and then the people in the study were like, well, what do we do now?
[537] And they didn't know if they were just going to continue.
[538] That's one of the things about like...
[539] Whoa.
[540] Yeah.
[541] Yeah.
[542] I should mention that the Blindsight implant is already working in monkeys.
[543] The resolution will be low at first, like early Nintendo graphics, but ultimately may exceed normal human vision.
[544] Holy shit.
[545] Also, no monkey has died or been seriously injured by a neuralink device.
[546] Look at March 21st.
[547] By a neuralink device.
[548] Right.
[549] But they did have to kill the monkeys that they originally did studies on, right?
[550] Yeah.
[551] Do you know much about, like, studying with animals and stuff like that?
[552] Yeah, you have to kill them to find out what damage you've done.
[553] Yeah, exactly.
[554] Well, like, basically all animals that are in studies at some point get...
[555] I think they have a really terrible term for it.
[556] I think they call it sacrifice.
[557] So they sacrifice.
[558] That's satanic.
[559] Yeah.
[560] It's crazy.
[561] Yeah, I know, right.
[562] I know, right.
[563] In this day and age, there's a lot of fucking fear of Satan.
[564] It's not a great, it's not a great term for their cause.
[565] We could have worked on that one.
[566] Yeah, I know, right?
[567] Just put a little bit more thought into it.
[568] Um, so yeah, they do that.
[569] They have to, like you said, learn something from the monkeys, um, from the animals that they're testing on.
[570] So some of them they will, um, you know, let live longer.
[571] Some of them they'll, uh, implant something in and then sacrifice almost immediately to see, because they have to know what it's doing short term, medium term, long term.
[572] Um, so basically all animals and all animal testing get sacrificed at some point.
[573] I don't know how true that is because obviously a lot of them, um, once they're done with the study that they're in, they let them go live if it wasn't too invasive.
[574] If they don't need to, like, study any part of them that they'd have to be killed for.
[575] But if you're going to study the brain?
[576] If you're going to study the brain, then there's really no other way.
[577] You've got to get it now.
[578] And then there was the whole, like, report that came out about all the terrible things that Neuralink was doing to monkeys.
[579] I've talked to the people.
[580] I got to meet them, the people who were working directly with the monkeys.
[581] Those monkeys have the best animal facility in the world.
[582] Someone, like, came in and built it, like, basically...
[583] they're going around now, that person is going around and changing how other, um, like, labs treat their monkeys, like, for the better. So they're, they're like revolutionizing the world of, like, animal testing, basically. So Neuralink treats their animals better than anywhere else. And then the report that came out and said, like, all these terrible things that were happening with animals, it's, it's skewed, because all the things that they brought up were just, it was all of the bad. Like, basically anything bad that happens to the monkey, or any of the animals, has to be reported, and gets reported in this, like, you know, XYZ format of: this is what's going on with the monkey, this is what happened, what we think happened, we had to kill the monkey, yes or no. But none of the other things get reported at all.
[584] None of the time between.
[585] Like if it's five years of the monkey and one bad thing happens, then there's a report about that one bad thing happened to the monkey.
[586] And you compile all of that and look at all these terrible things that are going on with the monkeys.
[587] But it's just not really true.
[588] Interesting.
[589] Yeah.
[590] Well, it's a tough one because some people don't think any studies should go on with animals at all.
[591] Yeah, yeah.
[592] And so for them, everything that happens to an animal in captivity for a scientific purpose is evil.
[593] You know, I get it.
[594] I get it from their perspective.
[595] Yeah, I get it.
[596] You know, they call us speciesists because we're willing to do things to monkeys that we, you know... Aren't there, like, a lot of evil people in the world we could practice on?
[597] You know?
[598] I mean, I don't want to give anyone any ideas.
[599] Right.
[600] That's a terrible thing to say.
[601] Should do that.
[602] But an innocent monkey, it's fine.
[603] Very weird.
[604] I mean, monkeys do terrible things to us, too.
[605] And they do provoked, you know, or if they live in urban neighborhoods where they rely on tourists and they steal their phones for food and attack people.
[606] Yeah, they're fuckers.
[607] Yeah.
[608] They can be fuckers.
[609] I mean, I saw a story of a monkey who basically tore some kids face off.
[610] I think he was, like, outside of the village or in his village.
[611] And the whole story was about how they were doing reconstructive surgery on the kid and, like, making him look a bit more normal again, but...
[612] That's terrifying.
[613] Monkeys are unbelievably strong.
[614] There's a video of a guy sitting on the ground cross -legged and a monkey hops on his shoulder.
[615] And then the guy's like thinking it's cute and smiley.
[616] And then the monkey just decides to take a massive chunk of his scalp off.
[617] Just bites down on his head and just takes like a football -sized chunk of scalp off this dude's head.
[618] It's horrible.
[619] Just decided for no reason, unprovoked.
[620] Wow.
[621] You know, monkey lives in a rough neighborhood.
[622] He had a hard life.
[623] He's not out there just, you know, picking fruit.
[624] This is it.
[625] So this dude is just sitting here with his monkey is like sitting on his lap and he's like talking to the monkey.
[626] He's like, hello, Mr. Monkey.
[627] Saying something bad about his home.
[628] Look, he doesn't seem like he's bummed out about the monkey.
[629] Now watch what the monkey does.
[630] Oh, geez.
[631] Yo.
[632] Ouchie Wawa.
[633] Is that his skull?
[634] I hope that was, okay, that was his skull.
[635] No, not that.
[636] I mean, that would have to be a chunk of skull, right?
[637] Yeah.
[638] But it was the skin.
[639] It pulled, I mean, that's gone.
[640] That's gone forever.
[641] Yikes.
[642] No thanks.
[643] Ouch.
[644] Yeah.
[645] Gross.
[646] But, you know, a monkey again, probably had a hard life.
[647] Yeah.
[648] You know?
[649] We need his monkey life reform.
[650] Yeah, I know, right.
[651] I think when we're looking at these kind of scientific experiments on animals, a lot of people are going to have a problem with.
[652] But I wonder with new technology, if that's even going to be necessary anymore, in the future, particularly with the leaps that are going to be made with AI, I wonder if they're going to be able to just be able to map out a study.
[653] you know, like understand the interactions between human beings and these devices and be able to map out the possibilities and probabilities without having to do that.
[654] Yeah, you would think so.
[655] Yeah, but who knows?
[656] Yeah, it makes sense.
[657] It makes sense.
[658] I feel like at the beginning, they would probably need to do like that along with a study on a human.
[659] So they might run, say, simulations a million times on what, you know, an AI simulation on what, how this would interact with a human, but then they would have to go in and do it to see how true the, like, simulations are.
[660] And then depending on how accurate they are, then maybe they could just go fully to that.
[661] But if it ends up being different, then...
[662] Yeah, I have a feeling they're going to be able to replace parts with artificial parts too, like the eyeball itself.
[663] I was just thinking about that the other day, like how complex, like look how small these little cameras are on our phones, little tiny -ass cameras, but one of these can do 100X zoom.
[664] Yeah.
[665] You know, one of these is 200 megapixels, this little tiny thin thing.
[666] Like, what's to say that they wouldn't be able to come up with something that works way better than the human eye?
[667] Well, you could zoom in.
[668] Yeah.
[669] Just like a phone.
[670] Just like zoom into something.
[671] But have like a real optical zoom.
[672] Yeah.
[673] I just hope they don't give them red retinas.
[674] That would be creepy.
[675] Yeah, I know.
[676] Terminator style.
[677] Yeah, I know.
[678] Yeah.
[679] It's just seems like they could do better.
[680] Yeah, it would be very weird talking to someone with two fake eyes.
[681] It'd be weird if you couldn't even tell.
[682] You probably wouldn't trust them anymore, right?
[683] Because you kind of like look into someone's eyes, you know, you find out if they're cool.
[684] Like if you're just looking into these lenses, you're like, are you even in there anymore?
[685] Yeah.
[686] I'm just trusting that you're still there.
[687] It's like talking to someone with sunglasses on forever.
[688] You know, you net...
[689] What's going on there, man?
[690] Yeah.
[691] Yeah, what are you looking at?
[692] That is a weird thing that we look through the eye.
[693] You know, it's the old expression.
[694] The windows to the soul.
[695] Yeah.
[696] I mean, you can tell, right?
[697] Yeah.
[698] Sometimes looking at people, you look in their eyes.
[699] You're like, I want to get too close to you.
[700] Like you got crazy eyes or something.
[701] Yeah.
[702] Yeah.
[703] Something like that.
[704] Yeah.
[705] Like I'm saying, maybe at some point you wouldn't even be able to tell.
[706] Which is also something to think about.
[707] Who?
[708] Like if you wouldn't be able to tell someone had robot eyes, like just looked like a normal person.
[709] Right.
[710] But that's a real part of how we interact with each other.
[711] It's like facial expressions and like figuring each other out just by like how your eyes are looking at me. Oh, man. Well, it's not like you distrust someone because they have, like, a glass eye or something.
[712] No, not a glass eye, but boot, roop.
[713] Yeah.
[714] If they got like, little red lights moving around inside their head.
[715] Maybe that's the only way it works.
[716] It has to make a little noise.
[717] Especially when you're alone with them, you know what I mean? Oh, man. All through the night.
[718] Little movements and rapid eye movement in the way.
[719] Yeah.
[720] While you're sleeping?
[721] Yeah.
[722] I mean, I'm sure they've worked.
[723] Wasn't there some sort of a study where they were trying to develop an artificial eye?
[724] I got a, whether or not this is, I'm trying to find out how real it is.
[725] I got one guy who has a 3D printed eye.
[726] 3D printed eye.
[727] It's got a camera in it or something.
[728] I'm trying to find it.
[729] Oh, doesn't it hook up to his tooth?
[730] I don't, that there's two things I'm seeing here.
[731] They've figured out a way to allow people to see things through their teeth.
[732] Yeah, I've seen that.
[733] I don't get it.
[734] I don't get it.
[735] I'm not going to get that one.
[736] There's not enough time in the world for me to figure that out.
[737] Thank God for smart people.
[738] I know, right.
[739] I mean, how are they getting it through the tooth?
[740] Both of them, the 3D printed eye.
[741] Here's the one guy, first guy.
[742] He's a director, shot himself in the eye on accident.
[743] Oh, yo.
[744] I guess he's got a camera in there, it says.
[745] And he sees through the camera?
[746] That I was trying to get to, yeah, it's got a transmitter.
[747] I don't know if it's going through his brain, but he can see it on a camera.
[748] Oh, so you can see it on a phone.
[749] Yeah.
[750] That's kind of weird.
[751] Maybe that's the first step that they need to.
[752] That guy would make the weirdest POV porn.
[753] Is this made in like the 90s?
[754] It's not new.
[755] Yes, 12 years ago.
[756] Uh -huh.
[757] So it looks like you're playing that on a Game Boy.
[758] Yeah.
[759] It's not a Game Boy, but it's some sort of a proprietary little electronic video player.
[760] Hmm.
[761] Yeah.
[762] Amazing times.
[763] Yeah.
[764] So what is next in terms of, like, how long is this study that you're on?
[765] A blind woman sees with tooth-in-eye surgery.
[766] Doctors in Florida restore a woman's sight by implanting a tooth in her eye.
[767] That's different.
[768] No, but I think that's how they do it.
[769] Okay.
[770] That is the thing.
[771] That's, I was saying like through your teeth, but I mean, that's, that is how they do it.
[772] A team of specialists at the University of Miami Miller School of Medicine announced Wednesday that they're the first surgeons in the United States to restore a person's sight by using a tooth.
[773] The procedure is formally called modified osteo-odonto-kera-top...
[774] Kerata, keratoprosthesis, sorry.
[775] Sharon K. Thornton, 60, went blind nine years ago from a rare disorder called Stevens-Johnson syndrome.
[776] The disorder left the surface of her eyes so severely scarred, she was legally blind, but doctors determined that the inside of her eyes were still functional.
[777] enough that she might one day see with the help of this thing.
[778] This is a patient where the surface of the eye was totally damaged, no wetness, no tears, said Dr. Victor L. Perez, the ophthalmologist at the Bascom Palmer Eye Institute at the University of Miami, who operated on Thornton.
[779] So we kind of recreate the environment of the mouth in the eye.
[780] What?
[781] I don't get that.
[782] Three-phase operations started with the University of Miami dentist, Dr. Yo Sawarti, Sawatari, who removed the tooth from Thornton's mouth and prepared an implant of her own dental tissue for her most severely damaged eye.
[783] The tissue would be used to make a new cornea to replace the damaged one.
[784] The doctors then removed a section of Thornton's cheek that would become the soft mucous tissue around her pupil.
[785] Whoa.
[786] Finally, Perez and his team implanted the modified tooth, which had a hole drilled through the center to support a prosthetic lens.
[787] We used that tooth as a platform to put the optical cylinder into the eye, explained Perez.
[788] Perez said doctors often use less risky and less invasive techniques to replace corneas, but the damage from Thornton's Stevens-Johnson syndrome ruled those out.
[789] Whoa.
[790] Yeah.
[791] Using a tooth might sound strange, but it also offers an advantage because doctors used Thornton's own cheek and tooth tissue.
[792] She faces less risk that her immune system will attack the tooth and reject the transplant.
[793] Patients getting a cornea transplant from a deceased donor, on the other hand, face chances that their immune system will reject the new tissue.
[794] Wow.
[795] Yeah.
[796] Wow.
[797] Yeah, for some reason, I thought they were using that tooth to, like, I don't know, use it as a replacement for, like, her vision in some way, but it's literally just a placeholder for, like, you know, different things, like the tissue and different places to, like they said, hold that lens and stuff.
[798] That makes more sense.
[799] Yeah, I thought it was that, too.
[800] I thought they were seeing through the teeth.
[801] Yeah, yeah.
[802] I was like, that doesn't, I don't get that.
[803] No, that makes more sense.
[804] Like, why can't we see through our teeth all the time?
[805] Be looking at what's going on in my mouth.
[806] Right.
[807] Yeah.
[808] All this stuff is, it's just mind -blowing to imagine where this is going to be in 100 years.
[809] Yeah.
[810] And with you, do you have the, like, if they start doing the range of motion studies or the being able to recreate motion or restore motion, are you going to be available for those studies?
[811] Can you do that too?
[812] Are you only...
[813] like locked into this one study? Yeah, I don't know. Uh, I imagine I'm locked into this, uh, for now at least. But at the same time, um, I'm not sure, I'm really not sure. You would have to do it with someone who already has the implant in their brain. So I don't know if it'll be a separate Neuralink that they would need, um, like a different one specifically for, um, like, the two implants interacting together.
[814] I don't see why that would be the case.
[815] Just like the same thing with people who they're going to have to test to see if the surgery to replace a neuralink is safe at some point.
[816] They're going to go through a whole thing.
[817] So they're going to have to do it on people who already have it in.
[818] So I imagine like that sort of study might be something I would be involved in if they're planning on implanting one in someone's spinal cord and then seeing how they interact and seeing if it works.
[819] I don't see why I couldn't be in that.
[820] But we'll see.
[821] It's kind of a long way off, I think.
[822] How big is the neuralink implant?
[823] It's about the size of a quarter.
[824] It's thicker than a quarter.
[825] I don't know, maybe half an inch, something like that thick.
[826] And does it, it's on the surface?
[827] Yeah, it's implanted on my skull.
[828] So they cut out a chunk of my skull.
[829] I think it's called a craniectomy.
[830] And then they left that chunk out and just replaced it with the neuralink.
[831] Do they take that chunk and, like, put it in the freezer so they could put it back in you someday?
[832] Yeah, I'm not sure.
[833] I don't think so.
[834] I, oh, talking about it.
[835] Yeah, talking about it afterwards.
[836] That's it.
[837] Yep.
[838] Yo, that's in your head.
[839] I was talking about it with my buddy afterwards.
[840] And I was like, I should have asked them for my chunk of skull.
[841] That would have been sweet.
[842] Yeah.
[843] Yeah, I don't think they're allowed to give that to people.
[844] Yeah.
[845] I think that's like bio waste or something like that.
[846] It's biohazard, I know.
[847] It should, it definitely should be.
[848] Yeah.
[849] They give people their testicles back.
[850] Right, but it has to be in like formaldehyde or something.
[851] Okay.
[852] So take your skull and put it into formaldehyde.
[853] Yeah, that's fine.
[854] As long as I can have it.
[855] Um, how many versions did they go through before they got to the one when they were willing to do it on people?
[856] Uh, a lot.
[857] I saw, like, from their very first idea of Neuralink through this one, I don't know.
[858] I don't know how, exactly how many there were.
[859] I would say at least one or two dozen different, like, iterations.
[860] Hmm.
[861] Yeah, and then, like, the version I have is...
[862] like thousands, in like the thousand or two thousand iteration of, um, this one. So, like, they're constantly changing stuff. So, like, even the next person that gets it, they've probably made, I don't know, a thousand more modifications to it. Um, little things, just like, if they've seen certain, certain things in my implant they can improve on. Obviously, they're going to change how the threads work.
[863] They're going to add more electrodes.
[864] They're going to maybe update the battery.
[865] They might update a lot of things.
[866] They're looking at updating what signal it uses instead of Bluetooth.
[867] They're looking at different things like that.
[868] So the next one that comes in is probably going to be much different.
[869] Maybe the same design.
[870] Maybe they found a better design.
[871] I don't know.
[872] Wow.
[873] And I know in the future they've talked about putting this into people that don't have any issues medically.
[874] What are they planning on doing?
[875] Like, how are they planning on that?
[876] Do you know?
[877] What do you mean how?
[878] Like, in terms of like, is that going to just be offered for, you're going to get, what are the long term goals?
[879] Is it to get the internet on that?
[880] Is there the people communicate telepathically?
[881] Is it going to be a slow build up to the idea that everyone is going to want to get one of these things?
[882] I think once it's proved, so like this, this study is to prove whether or not it's safe and if it works, basically.
[883] I think once that's proven, then they're going to get into a lot more of what it's actually capable of.
[884] And then once it's released to the public, I think people are going to rush to get it honestly.
[885] At least a group of people who have been following it at the very least.
[886] Because once we know that it's safe, then that's one of the big things that people are going to, like, once that's lifted, once you're like, okay, it's safe.
[887] Now we can go through and start talking about being able to communicate with people and being able to, you know, possibly download information or have it be available to you.
[888] Using AI and stuff like that. I'm not sure if that's going to happen; I don't see why it's not possible, at the very least. And then, I know Neuralink's talking about opening up a clinic in Austin, basically, where you would go in and get a surgery and, like, walk out. So it's not like, like, my surgery was, I don't want to say not invasive, because obviously they did brain surgery.
[889] But it was, they were expecting it to be, you know, something like three to six hours.
[890] And my surgery took under two hours.
[891] It went super, super fast.
[892] There were no complications at all.
[893] It was, obviously, invasive in the brain, but there was really no damage done.
[894] So, and this was the very first time.
[895] So once they get this even better, even more tuned in, then I imagine people go into this clinic and come out in a few hours with a Neuralink, and then they can chat with all their friends online or something else.
[896] Jesus.
[897] It'd be pretty cool.
[898] It'd be pretty cool.
[899] Again, I'm not, I'm not here to talk about, like, the ethical ramifications of that or, like, how...
[900] It's fun to think about, like, the things that might go wrong or could go wrong.
[901] And it's probably something that people much smarter than me should think about, whether or not it should be done.
[902] But I think there are so many things that you could do with it.
[903] I think it's going to be done no matter what.
[904] And if it's not done by neuralink, it's going to be done by someone in another country.
[905] It's going to be done.
[906] Technology always moves forward.
[907] It never stops over concerns of what could possibly go wrong, hence the nuclear bomb.
[908] Yeah.
[909] It's not going to stop.
[910] Yeah.
[911] It's just not what we do.
[912] We always try to come up with greater things.
[913] And if someone does figure out a way to connect human beings to some form of wireless internet or wireless data or some completely new thing, instead of thinking it as the internet, as we know it, being these devices that go to websites, it might be a completely different invention that uses a completely different type of technology to sync...
[914] all the information and all the minds in the world together.
[915] It might not be as dopey as going to a website.
[916] Like going to a website is probably like an archaic way to do it.
[917] Yeah.
[918] You know?
[919] It'll be like the cloud or the Metaverse or something.
[920] You can just hop in and everyone will be there.
[921] You can go chat with whoever you want around the world.
[922] And they can just upgrade your operating system and make you woke.
[923] Yeah.
[924] Exactly, right?
[925] You sign up for the wrong one.
[926] You know, you got some way crazy ideas.
[927] Propaganda will take new leaps and bounds.
[928] Right, but then who's running it?
[929] Is it one person and everybody else is a robot?
[930] Like, that doesn't make any sense.
[931] Well, they'll try.
[932] I'm sure someone's going to want to run it all.
[933] Someone is going to want to run it.
[934] Mm-hmm.
[935] Yeah.
[936] It's going to need to be regulated. Hopefully, by that point, they will regulate it.
[937] But as we've seen with like, you know, things like AI art even, they're trying to catch up with that.
[938] It's like, oh, should we, should we have like thought about this before all this was released?
[939] And like, no, government will figure it out.
[940] Yeah, good luck with that.
[941] Right.
[942] Yeah.
[943] Well, they're able to scour the internet for every artist's work and then sort of take pieces of that and create art. And these artists are like, hey, you know, that took me fucking forever to paint that.
[944] And you just stole it and did a version of it in 13 seconds.
[945] Yeah.
[946] Weird.
[947] Yeah.
[948] And that's just one problem.
[949] Another problem is deep fakes and songs.
[950] They made a Drake song that became a hit, and Drake had nothing to do with it.
[951] It's not that far away from it being out of the barn where you're not going to be able to ever stop...
[952] You're going to be able to do whatever you want in terms of, like, creating videos, audios, and it'll look indistinguishable from a real video, real audio.
[953] Yeah.
[954] They're already going to take this podcast and translate it into different languages without me being able to speak them just through AI.
[955] Yeah, I mean, I think they did the same thing with the deep fake like you were just saying.
[956] I think they did something with, like, Trump recently, where it was like a deep fake of Trump.
[957] And after a while, he had to be like, hey, guys, that wasn't me?
[958] Wasn't there a football player that was saying some wild shit that turned out to be fake or a basketball player?
[959] Yeah, that too.
[960] Did you hear about this, Jamie?
[961] Depends on exactly what you're talking about.
[962] But there's a bunch of fake press conferences that went viral.
[963] Yes, that's what I'm talking about.
[964] Yeah, it's like a thing someone's doing.
[965] Yeah, but apparently it was just barely wacky enough for people to go that looks fake.
[966] But you have to be very sophisticated.
[967] If you saw this, I mean, we're getting used to looking for things being fake, whereas 20 years ago, you would say that's real.
[968] Yeah.
[969] I see it.
[970] It's a video.
[971] It's real.
[972] It's something that I was actually just talking with my buddy about the other day.
[973] I think it's going to be something similar to, you know, how like we get emails from Nigerian princes.
[974] And we're like, yeah, like grandma, don't open that.
[975] Don't send them money.
[976] It's not real.
[977] I think it's going to be something that people are able to do like the next generations where they look at something online and they're like, oh, yeah, that's AI.
[978] Oh, yeah, that's fake.
[979] Yeah, I think you're right.
[980] Yeah, they're just, they're going to grow up with it, so they're going to be able to figure it out.
[981] But maybe not.
[982] This stuff looks so real that I don't know, but maybe they're going to have to be required to do like watermarks or something on it every time.
[983] I don't think they're going to be able to stop it.
[984] I think we're just going to get to a real weird, blurry place.
[985] I think the one thing that might help.
[986] And this sounds crazy.
[987] But I think ultimately what technology does is it closes distances, it makes things more accessible.
[988] It gets you more information.
[989] It connects people more.
[990] Like, with translation.
[991] It's connecting people from different cultures and different countries more.
[992] I think ultimately what it's going to do is it's going to be some sort of a mind interface.
[993] I don't think it's going to be as simple as language.
[994] I think it's going to be a next level mind interface.
[995] And if it's something through a technology akin to neuralink or maybe future versions of neuralink, I think we're going to be able to know what someone's actually thinking.
[996] I think you're not going to be able to lie anymore, is what I'm saying.
[997] I don't think lying is going to be possible 100 years from now, which would be a really good thing.
[998] And if you're a person right now that lives your life without lying, you know this.
[999] This is way better.
[1000] As a person who used to lie and doesn't lie ever now, I'll tell you right now, it's great.
[1001] I love it.
[1002] It's a good thing to not lie.
[1003] And if you live your life in this manner where there cannot be deception, how much more would we get done?
[1004] How much more would we understand each other in relationships?
[1005] And if you're bullshitting, you'll understand that you're bullshitting by the way another person sees your thoughts and then you'll be forced to handle those and go, you know what?
[1006] I'm trying to put this off on other people and it's really me. I'm the problem.
[1007] Yeah.
[1008] You'll be able to see it.
[1009] Everyone will see reality instead of these sort of manufactured narratives that people have with this very selective view of memory and their thoughts of the past.
[1010] And, you know, my boss did me wrong.
[1011] No, you were a fuck up.
[1012] You showed up late every day.
[1013] Like, you know, they fucking hated me. No, you were super insecure and real shitty around people.
[1014] You know, it's like you'll see. We'll be able to solve a lot of our social issues that seem insurmountable because of poor communication, poor understanding, and a lack of honesty, a lack of real honest conversations instead of just people trying to win arguments.
[1015] Yeah, yeah, that'd be great until people realize that, you know, maybe you don't need to lie exactly.
[1016] Maybe you can find ways to work around having to lie with this thing.
[1017] If you can't lie anymore, if you're not allowed to, I mean, people find ways to kind of sort of lie all the time.
[1018] And then also, if you can hack it and then you're able to lie and no one else is, then that becomes kind of an issue too.
[1019] If in some way you are able to, like, jailbreak your Neuralink so you can lie, and then you're the only one lying, everyone's going to believe you.
[1020] They think that you can't lie, and then that brings up a whole new world of problems.
[1021] In my eyes, you're seeing right into the thoughts.
[1022] Oh, I see.
[1023] I don't think you have a chance to lie.
[1024] I don't think there's any lying; it just doesn't exist anymore.
[1025] I think it goes away.
[1026] And hence, leaders go away.
[1027] That's going to be a real problem.
[1028] We're going to have to have...
[1029] actual understanding of all the different processes that are in play, whether it's environment or resources or, you know, inter-country conflicts, whatever the fuck is going on. We have to have a real understanding of it without politicians bullshitting us as to why we're going to do something.
[1030] That won't exist anymore.
[1031] That would be wild.
[1032] They would be the ones that would resist it the most.
[1033] They'd be like, wait, this is dangerous mind-reading technology.
[1034] Like, you'd have fucking Nancy Pelosi have a press conference.
[1035] I mean, I just think if something like that ever came about, they would never let it happen.
[1036] I don't think they have a choice.
[1037] Because China will do it, Russia will do it, everyone will do it.
[1038] Someone's going to do it.
[1039] All these eggheads out there that are willing to push that, but they're not going to listen to the government.
[1040] Shut the fuck up.
[1041] The government is just a bunch of people.
[1042] The super nerds out there are the ones who are really in charge of this stuff because even we're seeing this with technology and some of these hearings on AI, the people that are asking the questions don't know what the fuck is going on.
[1043] Yeah.
[1044] You know, and I'm sure you saw that with some of the Facebook hearings and some of the other hearings.
[1045] The people that are actually asking about the technology, how much time do you have to get into the understanding of this?
[1046] How much time between worrying about water rights in your district and this and all these other problems that you have as a politician?
[1047] Yeah.
[1048] How much time are you actually spending trying to figure out how social media works?
[1049] Probably none.
[1050] They just have aides that are giving them all this stuff.
[1051] That's why they have pieces of paper and they're looking down with their reading glasses.
[1052] Now, Mr. Zuckerberg, my phone doesn't go to Google, right?
[1053] Why is that?
[1054] He's like grandpas who argue on Facebook.
[1055] They're not going to be the people that control AI and they're not going to be the people that are going to be able to figure out how to stop mind reading technology.
[1056] I think when mind reading technology comes, it's going to come so fast that it's going to be just like all these other things like the internet.
[1057] It came so fast they couldn't control it.
[1058] Because if you looked at the internet, if you looked at...
[1059] What the internet has done for, like, a distrust in mainstream media, distrust in politicians, exposing corruption, all the different things that we know about now that are a fact, that just 20 years ago you would have thought was crazy conspiracy talk.
[1060] If they knew that that was going to happen and make life so much more difficult for them, they would have regulated the internet from the jump.
[1061] They would have stepped in, took over like China did.
[1062] took over like North Korea did, and you would get their version of the internet forever, and that's it, and there's no growth, and they'll silence dissidents.
[1063] And that's how they would have done it if they had ever known that it was going to be what it is now.
[1064] I think that's exactly what's going to happen with mind-reading software and mind-reading technology.
[1065] I think it's going to happen.
[1066] They're going to be...
[1067] Oh, Jesus Christ.
[1068] I don't think...
[1069] You know, and also...
[1070] Look, they're just human beings too.
[1071] They're going to want that.
[1072] Yeah.
[1073] If they find out there's a technology that allows you to communicate with people in a completely new way and it's much more fulfilling and we understand each other much better and we really do realize that we are all one.
[1074] Imagine we can communicate with this technology and it ends war overnight.
[1075] Yeah.
[1076] It makes war literally impossible.
[1077] You realize that these people that you're about to bomb are you and that we're all the same thing.
[1078] We're all one consciousness experiencing itself through different bodies and different lives and different experiences and different genes and different parts of the world.
[1079] But we're all genuinely the same thing.
[1080] Yeah.
[1081] Yeah, I don't know.
[1082] Brings up a lot of questions, like where we would go from there, though, like how it's going to change. That's when the aliens land. The aliens land, we figured it out. Oh, finally. Oh, man. Yeah, we were waiting. If that's what it takes to bring aliens down, then I'm all for it, if that's, if that's what it takes to really get us to be face to face. The only thing I keep telling my buddy is, like, I am all down for the whole, like, aliens come in, us interacting with them and everything, as long as they're not the mantids.
[1083] If they're the mantis people, I don't want anything to do with them.
[1084] I think that, you know, I just don't want it.
[1085] I'm with you, bro.
[1086] Yeah.
[1087] Fuck the mantis people.
[1088] Can you imagine mantises were like the size of a dog?
[1089] We'd be so fucked.
[1090] Yeah.
[1091] We'd be so fucked.
[1092] Yeah.
[1093] One of the most gangster videos I've ever seen online is like a gecko, and the gecko's trying to eat the mantis.
[1094] And the gecko walks up to the mantis and tries to get it, and the mantis is like, not today, bitch, I'm gonna eat you.
[1095] Oh my gosh.
[1096] And the gecko's like, what is happening?
[1097] You can see it look at its face, it's like so confused.
[1098] And it's got its claws, these fucking, these giant things wrapped around and controlling, and it just starts eating its face.
[1099] Yep, mantises are, like, insects themselves. Like, you really get up close to an insect, you're like, that thing is ugly, I do not like it one bit. Now imagine that. And the things I've heard about the mantis is they're not the size of a dog, they're, like, the size of multiple people, and no, that's, like, absolutely not. The mantis aliens I'm not too familiar with. I've seen, like, a couple things online. How many people have seen the mantis aliens? Yeah, I know of one story where there was a hunter just walking around, and it got, like, dark over him or something, and he looked up, and there was just, like, a ship over him, and he looked through his scope, and he looked right into some, like, mantis people.
[1100] And I'm not okay with that.
[1101] Like that's the one alien story I think I'll stay far away from and hope it's something else.
[1102] Well, you got to think that
[1103] insects have some kind of bizarre intelligence, because if you've ever seen leaf cutter ant colonies, when they pour the cement in them, you realize, like, how sophisticated they are. Like, how do you guys do this? Like, how do you figure this out? They have channels where the air can pass through so that they can ferment leaves, so they have, like, a fermentation factory inside their ant colony. And the colony is huge. Yeah. It's so big.
[1104] And you're like, you little tiny fuckers built a city underground right here.
[1105] There's got to be some sort of intelligence.
[1106] Now, if ants evolved to the point where they develop that kind of intelligence, who's to say that in a different environment, where ants have more access to food, more access to resources, and more competition, they don't evolve to the point where that intelligence keeps getting scaled up.
[1107] And they get to like a human, human level intelligence from an insect or beyond.
[1108] Why not?
[1109] Yeah, they just need some psychedelics or something to really get that brain to grow.
[1110] Or a neuralink.
[1111] Yeah, right.
[1112] That's what I have a feeling.
[1113] I have a feeling that in the future, everyone's going to be some sort of a cyborg and everyone else is going to be artificial.
[1114] that there'll be complete life forms that were developed just with computers.
[1115] Just like computers, technology, whatever form of chips and they'll put together things that are more intelligent than us, can communicate with us, can work with us, but that's going to be one of those things.
[1116] It's not going to be one of us, and that'll be a different life form that exists alongside with us.
[1117] But I don't think there'll be very many people like me. No chip, no nothing, just a person.
[1118] Like, what is that moron doing?
[1119] You're running around with no chip?
[1120] You know, I think in the future, it's going to be everyone's going to have something that enhances them.
[1121] We already do with our phones.
[1122] Yeah.
[1123] You know?
[1124] Yeah.
[1125] It's going to be something like beyond that, where it's going to be so compelling that everyone's going to want to do it.
[1126] So you're not going to get it if it comes out?
[1127] I'm not saying we're not going to get it.
[1128] I might get it.
[1129] I might have to.
[1130] I don't want to be alone.
[1131] Yeah.
[1132] I'm going to be the only person who can't read minds.
[1133] I probably wouldn't want to be the first adopter.
[1134] Yeah.
[1135] Yeah.
[1136] I want to wait a little bit.
[1137] Yeah.
[1138] That was an argument that I had with doing this: do I really want to be the first?
[1139] I mean, who knows what kind of problems there are going to be, but...
[1140] But for a guy like you, I would say, like, they're pretty sure it works.
[1141] And they were right.
[1142] Yeah.
[1143] You know, and how cool was it the first day to be able to play video games?
[1144] Yeah, it was awesome.
[1145] What did you play?
[1146] Civilization 6.
[1147] I don't know.
[1148] I don't think you've heard of it.
[1149] It's a massive game.
[1150] It's something I've been wanting to play for a long time.
[1151] I was able to kind of sort of play it with some different assistive technology.
[1152] over the last few years, but not really.
[1153] And I played it like all night.
[1154] I didn't sleep.
[1155] It was freaking awesome.
[1156] Man, I just love, I mean, I grew up being a gamer.
[1157] I grew up in kind of this age.
[1158] So the last eight years, I've watched all of my friends play games that I've wanted to play.
[1159] And the fact that I might be able to play some of them, like some of them are still too far out of reach for the Neuralink at this point, but not for much longer.
[1160] In the next few years, I think I'll be able to play anything anyone else plays.
[1161] Halo.
[1162] I love Halo.
[1163] I'm a big Halo fan.
[1164] You're going to be able to play that?
[1165] Yeah, I hope so.
[1166] Wow.
[1167] I really hope so.
[1168] Yeah.
[1169] So you'll be able to play shooters, like Call of Duty.
[1170] Yeah, yeah, that's, that brings up another thing.
[1171] Like, I basically have an aim bot in my head.
[1172] Oh, that's crazy.
[1173] They'll probably have, like, different leagues for people like me, because it's just not fair.
[1174] Wow, is it that accurate?
[1175] It's that accurate.
[1176] And it's faster.
[1177] One thing that I found with the Neuralink, something that kind of blew my mind too, is that when I'm attempting to do stuff sometimes, or I'm thinking it to move in a certain place, sometimes it's so good that it's moving before I even, like, think it to move.
[1178] It's almost like, if you think about moving your hand, the signal is basically already being sent before you move your hand.
[1179] Like your mind is saying, okay, he's about to move his hand, basically.
[1180] So the signal needs to be sent all the way down and back up in order for you to move your hand.
[1181] So the speed that all that happens, and it's almost a little preemptive, I saw that with the neuralink, where it was moving the cursor before I was actually moving my hand.
[1182] Wow.
[1183] So with video games, stuff like that, you just need to think for it to move somewhere, and it is that accurate, and it's quicker than you can even think.
[1184] So there's no way it's going to, like no one else is going to be able to keep up with it.
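(A note for readers curious how this kind of cursor control is usually wired up: the sketch below is a minimal linear intention decoder of the sort reported in many published BCI cursor studies, not Neuralink's actual pipeline, which is not public. The channel count, bin width, and weights here are illustrative assumptions only.)

```python
import numpy as np

N_CHANNELS = 1024   # assumed electrode count, for illustration only
BIN_S = 0.025       # assumed 25 ms spike-count bins (~40 updates per second)

# W maps binned firing rates to (vx, vy) cursor velocity. In real systems it is
# fit during a calibration session while the user imagines cursor movements;
# here it is random just so the example runs end to end.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))
b = np.zeros(2)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Map one bin of per-channel spike counts to a 2-D cursor velocity."""
    rates = spike_counts / BIN_S          # convert counts to spikes per second
    return W @ rates + b

def step_cursor(pos: np.ndarray, spike_counts: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """Integrate the decoded velocity over one bin to get the new cursor position."""
    return pos + gain * decode_velocity(spike_counts) * BIN_S

# Example: one simulated bin of spike counts nudges the cursor a small step.
pos = np.zeros(2)
counts = rng.poisson(lam=2.0, size=N_CHANNELS).astype(float)
pos = step_cursor(pos, counts)
```

(The "moving before I even think it" feeling he describes fits this picture: the decoder reads the motor command at its source in cortex, so the cursor can respond before any signal would have had time to travel down to a limb and back.)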
[1185] That's going to be wild for something like Quake.
[1186] Like a first person, like a fast first person shooter.
[1187] You're running down hallways and you're just catching people and shooting them instantaneously.
[1188] Elon Musk will have a field day.
[1189] Wasn't he like one of the best Quake players in the world?
[1190] Was he?
[1191] Yeah, I didn't know that.
[1192] Yeah, I think he was like one of the top Quake players in North America at one point.
[1193] I don't, I wouldn't doubt that.
[1194] Yeah.
[1195] I know he's a gamer.
[1196] I know he gets addicted to games.
[1197] Yeah.
[1198] Especially something that's that exciting.
[1199] That's going to be so dope for you, man. You'll be fucking people up.
[1200] Yeah.
[1201] It'll be cool.
[1202] I'll just enter tournaments and I won't tell them I have the Neuralink.
[1203] And I'll, I don't know how I would do it, I guess.
[1204] Yeah, but.
[1205] If I'm doing it all online, it would be kind of cool for you to play them in a tournament, like a one-on-one tournament, and fuck up like the best players in the world.
[1206] Wouldn't that be insane?
[1207] Yeah.
[1208] I bet they would play you, just to see.
[1209] Yeah, for sure.
[1210] For sure.
[1211] Yeah, because, like, there's tactics and strategy, especially if you're, like, doing one-on-one deathmatch, where you have to know, like, when the health is spawning and when the weapons are spawning, how to control a map.
[1212] Mm-hmm.
[1213] So they'll have, like, a little bit of an advantage in that.
[1214] But if you just can't miss...
[1215] I'm pretty good at video games.
[1216] I'm pretty good.
[1217] He's getting cocky.
[1218] Yeah.
[1219] I like it.
[1220] Yeah.
[1221] I like it.
[1222] Now, what about VR?
[1223] Has there been any sort of interface that allows you to use, like, Meta's VR or Oculus?
[1224] No, not yet.
[1225] I don't think...
[1226] So, like, a lot of what we've done is just the computer at this point.
[1227] Like, they're planning on doing it into phones.
[1228] I did connect to a Nintendo Switch at one point.
[1229] I was playing Mario Kart.
[1230] And that's something that isn't, like, too far off as well for me to just be able to do that on my own.
[1231] But that's going to be every console.
[1232] I don't see why VR would be any different.
[1233] I think at some point in this study, they're going to do it just to see if it works.
[1234] I don't see why it wouldn't at all.
[1235] The only thing that I would say is that VR actually requires physical movement.
[1236] Like there's a couple games that we have.
[1237] Yeah, but if it's already interpreting your motor cortex, the movement signals from your motor cortex, then you can just think, move this, and it'll move it in VR as well.
[1238] I think it would work.
[1239] Right, but you're actually moving these handles in VR.
[1240] Oh, yeah, yeah, I see what you mean.
[1241] Yeah, I see what you mean.
[1242] You know, you have the handles.
[1243] We'll just get an Optimus robot and then have him hold the VR handles and then you can control that.
[1244] He's connected to you.
[1245] Yeah.
[1246] Whoa.
[1247] It would be the same, yeah.
[1248] Bro.
[1249] You're going to be inside that thing walking around.
[1250] I'll just, I've always thought.
[1251] Yeah, I've always thought if, you know, you just give me an Optimus robot, I'll have it get one of those, like, baby chest carriers or something.
[1252] Yeah.
[1253] And they can just carry me around like that.
[1254] Bro.
[1255] You imagine walking down the street with that?
[1256] You'd absolutely step on people.
[1257] Do you ever watch Dave Chappelle's old, like, show that he did?
[1258] There was a...
[1259] The Chappelle show, you mean?
[1260] The Chappelle show.
[1261] Yeah, I was on it a couple times.
[1262] I didn't.
[1263] Sorry, I didn't know.
[1264] No worries.
[1265] There was one about home stenographers.
[1266] And it's basically like a little person that they carry around on one of those carriers.
[1267] like, on their back, and it's just like a stenographer.
[1268] He's typing down everything you say, and I just want something like that, like a little Optimus robot carrying me around on its back.
[1269] Like a kangaroo pouch?
[1270] Yeah, something like that.
[1271] Put it right in the front, so you'd just be sitting there.
[1272] That's what you want.
[1273] You want like your head on the chest, and it's just...
[1274] Yeah, yeah.
[1275] It'd be sick.
[1276] What's this guy doing?
[1277] Oh, there it is.
[1278] Oh, a stenographer.
[1279] And he just reads back things.
[1280] Oh, what you said.
[1281] Yeah.
[1282] It's really for me. Yeah.
[1283] Oh, I do remember that bit.
[1284] Yeah.
[1285] Yeah, that would, like, I think the future is going to be very interesting.
[1286] And I think there's going to be a lot of really wild discoveries that build upon other wild discoveries and stuff like Neuralink.
[1287] I'm sure there's competing companies that are doing something similar, right?
[1288] Yeah.
[1289] Yeah, that's what I'm saying.
[1290] I think that, like, I'm pretty sure some of the people who have left Neuralink have gone and either started their own little companies or have gone to other companies that are doing something similar.
[1291] I think Neuralink's advancements now are going to pull everyone else up.
[1292] I think Neuralink will be in the lead for quite a while, but I don't see why companies that haven't been able to achieve what Neuralink is achieving now won't be able to do it in a year or two's time.
[1293] Like, especially, like I said, because Neuralink is making everything so open source.
[1294] And there's people like me out there who are just talking about it, like, willy-nilly, so I don't see why other companies won't, you know, find some way to catch up over time.
[1295] No, for sure.
[1296] I think with them leading the way and the fact that it's been implemented and it's been successful, and the fact that they're already improving upon the software and how yours works and being able to correct issues with it, what is their timeline?
[1297] Like, in terms of next being able to use something that allows people who couldn't move to move, or restore sight.
[1298] Do they have like a timeline where they think?
[1299] Yeah, I don't know.
[1300] I keep saying that it's all going to happen in my lifetime for sure.
[1301] I keep saying that it's going to happen in the next 10 years, 20 years, where quadriplegics like me, paralyzed people, won't have to be paralyzed anymore.
[1302] I have this vision of someone being paralyzed going into the hospital,
[1303] getting the neuralink and walking out like a day or two later, which I think is totally possible.
[1304] I think it's going to happen a lot sooner than later, especially how fast all this is moving and the fact that this is like successful now, I think it all.
[1305] I don't know that it would help me per se, even though I said it's in my lifetime.
[1306] Um, part of being paralyzed and quadriplegic is my body has just deteriorated so much that even if they did give me something to, like, make me able to move again, my body's just so jacked up at this point that I'm not sure it would really help that much.
[1307] I could probably build it back up to a certain point, but even people who have
[1308] recovered or have been part of studies where they get some movement back, their bodies just don't work the same.
[1309] Because of atrophy?
[1310] Because of atrophy.
[1311] Like one of my, you know, ankles is completely jacked up.
[1312] It's like twisted the wrong way.
[1313] I have to wear this hand brace because if I don't, my fingers are all just like curled up basically.
[1314] And so, like, correcting some of that would take probably some extensive surgery.
[1315] One of my buddies is, like, one of the top ortho surgeons in the United States.
[1316] So maybe I could just get him to go in and fix it all.
[1317] But it would be a lot.
[1318] And I'm not sure it would help with the muscle atrophy.
[1319] So.
[1320] I don't know, but that doesn't matter to me. What matters is that people won't have to be paralyzed in the future.
[1321] Like that, that's more, like that's worth more than anything.
[1322] Well, that's also one of the legitimate uses for steroids.
[1323] One of the legitimate uses for steroids is for people with, like, muscle-wasting disease and, yeah, people who are severely atrophied, in that it allows them to build up tissue better.
[1324] Oh, yeah.
[1325] Maybe that would help.
[1326] Yeah, stem cells, steroids.
[1327] Going to make you a superhuman, bro.
[1328] Yeah.
[1329] I mean, aren't I already one?
[1330] Kind of already.
[1331] Yeah.
[1332] Especially if you're playing Quake.
[1333] I can't wait to see that.
[1334] That's...
[1335] That'd be sick.
[1336] Now...
[1337] With the future of this stuff, it's going to eventually get to a point... like, in the beginning it's probably going to be very difficult to acquire, right?
[1338] Like very expensive.
[1339] But it's probably in the future going to be much more accessible.
[1340] Yeah.
[1341] When do they, like, if they complete your trial, they find it satisfactory, they have, like, a way to do it, when will the average person who is a quadriplegic be able to start being able to use some of this technology? I have no idea. I know that my study is, like, the main part of the study is a year, and then five years of kind of extensive, like, follow-up stuff in the study.
[1342] So once that's done, however many people, I've seen numbers up to like a few hundred people, have it in this five-year timeline.
[1343] So once all that happens, I don't know what like phase two is with this.
[1344] I would say...
[1345] 20 years, but that's me probably also being very optimistic.
[1346] I have no idea.
[1347] I don't know, like what the FDA is going to decide with all this.
[1348] I don't know how much, how many more phases of the trial need to happen before that.
[1349] I really couldn't tell you.
[1350] But honestly, I think it's within my lifetime for sure.
[1351] And how did they contact you?
[1352] How did you wind up getting chosen?
[1353] Yeah.
[1354] Oh, my buddy's probably out there having a freaking heart attack.
[1355] So basically what happened was I knew nothing about Neuralink.
[1356] I was just lying in my bed one day and I got a phone call from my buddy at like 11 a.m. or something.
[1357] And he, I answered the phone and I was like, what's up?
[1358] And he was like, you know, Neuralink just opened up their first in-human trials.
[1359] He's like, you should apply for this.
[1360] I was like, cool, like what is it?
[1361] So he explained it to me, gave me like a five-minute rundown of what they're doing and stuff.
[1362] And we applied over the phone.
[1363] Like I just basically told him all my information.
[1364] He applied for me. He spelled my name wrong on the application, which is pretty funny because he was drunk at the time.
[1365] Again, on a Wednesday in the middle of the week.
[1366] At like 11 a.m., he was already wasted.
[1367] Respect.
[1368] Yeah.
[1369] Respect to the day drinkers.
[1370] Yeah, yeah.
[1371] His justification for it is that he was like going to a wedding that weekend.
[1372] He hadn't drank in a long time.
[1373] He's like, I need to understand what my tolerance is.
[1374] So he drank like a whole bottle of fireball or something like that just to see like how he would be.
[1375] So yeah, we did all that.
[1376] And then within like a day or two, they contacted me. And then I went through about a month -long application process of different like Zoom interviews and stuff.
[1377] And then finally culminating in like an in-person interview, or an in-person, like, full day of testing, where they did like eight hours of tests on me, like different scans, blood tests, urine tests, things like that.
[1378] And then I was just waiting.
[1379] What was it like when you found out it was going to be you?
[1380] It was cool.
[1381] It was cool.
[1382] Did you get an email?
[1383] Did you get a phone call?
[1384] They called me. They called me for the first, like... So I applied in late September, like September 19th or something around that day.
[1385] A month later, October, end of October, I had finished all my testing and interviews.
[1386] And then I didn't find out they had chosen me until maybe the end of November, early December.
[1387] And even when they said they had chosen me, they said I was going to be one of the first, like three people that they were doing for the first part of the study.
[1388] So they didn't tell me I was going to be the first.
[1389] They just said, we selected you as one of the candidates, basically.
[1390] Okay.
[1391] And so that was really, really cool.
[1392] And then it was sort of back and forth.
[1393] Do I want to be the first?
[1394] Do I want to wait until I have someone else?
[1395] Because being the first, a lot more risk, obviously.
[1396] And I have the worst version of the neuralink that's ever going to be in anyone.
[1397] It's only going to get better.
[1398] So I was like, maybe I'll let someone else be the first and then I get a better version, the second or third one.
[1399] But ultimately being the first is cool.
[1400] It's something that I just decided to do.
[1401] I was like, this is the best way I can help too.
[1402] If anything goes wrong, I'd rather it go wrong with me than passing it up and having someone else struggle.
[1403] Having someone else, like, God forbid anything bad happened to someone, I would rather it happened to me, and I would rather not have passed it up and watched it happen to someone else.
[1404] So, um, decided to do it.
[1405] And I was like, yeah, just let me know if I'm going to be the first or not.
[1406] Obviously I wanted to at that point.
[1407] And then about a month later, they called and they were like, we're going to do your surgery.
[1408] You're going to be the first person.
[1409] I think in December they had told me that it could be me, and they said that we might end up having you be the first, and it could happen as early as mid-December.
[1410] And that kind of stressed me out because I was a little worried that something bad would happen, and I would have ruined Christmas for my family forever.
[1411] I was like, if this is right around Christmas and something bad happens, like Christmas is going to be ruined forever.
[1412] Luckily, they waited like an extra month and a half, but it was cool.
[1413] Like, I kept pretty level expectations through the whole thing.
[1414] I didn't know what was going to come of it.
[1415] I didn't know if I was going to end up doing media or anything.
[1416] It was something that I talked to my parents about, but it wasn't something that I really wanted to do, per se.
[1417] I wasn't wanting to, like, get famous or anything from this.
[1418] There are a couple things that I did want to do, and ultimately, that's why I decided to do media.
[1419] But, yeah, it was cool.
[1420] It was all right.
[1421] You have a very noble and selfless outlook.
[1422] Have you always had that?
[1423] No. No, I would say being paralyzed made me just rethink a lot of things in my life, a lot of my perspective.
[1424] I mean, one thing about being paralyzed is there's, especially being a quadriplegic, you just have a lot of time to think.
[1425] I thought through everything I'd ever done, all the mistakes I ever made, why I was who I was, where I, like where I was.
[1426] I realized a lot of things about myself.
[1427] I realized, you know, that I wasn't the person who I thought I was.
[1428] I always built myself up a certain way and then going back through all of my interactions with everyone, all the mistakes I made.
[1429] I realized I'm painting a much prettier picture of myself in my head.
[1430] than who I, like, actually was, than a lot of the interactions I had with people, you know, actions speak louder than words.
[1431] And if I was thinking I was this great person and treating people, like, absolute, you know, dog crap, basically, like, then maybe I'm not as good of a guy as I thought I was.
[1432] Um, I realized that I wasn't as good of a son as I thought I was.
[1433] I wasn't as good of a boyfriend as I thought I was.
[1434] And I wasn't as good of a friend. So I found the reasons why I was doing these things, and I thought about it for probably a few years, just lying in my bed, staring at walls for, you know, eight, ten hours a day, just thinking.
[1435] And eventually I came to this conclusion that partly through my, like, faith, my interactions with God and...
[1436] Partly just because I wanted to be better.
[1437] I wanted to be a better person.
[1438] I realized that there were things that I could do to help.
[1439] And this seemed like my best chance, honestly.
[1440] Wow.
[1441] That's a wild thing to happen to someone, to have a, like, a radical shift in perspective that's forced upon you.
[1442] Yeah.
[1443] Yeah.
[1444] I...
[1445] I heard you say, I was watching, I can't remember which interview of yours I was watching.
[1446] Maybe it was the Tucker interview, maybe it was the Terence Howard interview, because I just watched those ones recently.
[1447] And you were talking about people, like, people never having had anything, like, extreme happen to them.
[1448] And so, you know, they're never forced to think certain ways or they just, I don't know, they never grow in certain ways.
[1449] That's paraphrasing, but it was along those lines.
[1450] And, I don't know, being a quadriplegic is...
[1451] I kind of make this joke, but it's easier than people think.
[1452] I mean, I just get waited on all the time.
[1453] I get to lie in bed and watch TV and read books and people bring me food and bring me drinks and people do everything for me. Like, it's really not that bad.
[1454] But obviously, it was really, really hard.
[1455] Like, being paralyzed, getting all of the things that I love to do most taken from me. Like, I was a really big athlete.
[1456] I played like every sport under the sun.
[1457] And then not being able to play sports anymore was one of the hardest things that I think I've ever had to go through.
[1458] And there were a lot of other things, not having any privacy anymore.
[1459] Like having to have everyone do everything for me, like go to the bathroom, having to take a shower with people, having like my parents scrub me in the shower or having my mom like help me go to the bathroom.
[1460] It's not easy, and it's not easy being a burden to everyone around you. And people always say, like, you're not a burden, like, we love you and we would do anything for you. But, like, I am. I know I am. It's not something that someone's going to be able to convince me that I'm not. And I understand that they love me and they're willing to do it, but at the same time, like, obviously, there are things that if I could change, I would, and I can't.
[1461] So I just have to try my best to do as much as I can for those around me. And this is part of what I can do.
[1462] I've thought for years, like, what could I possibly do to help?
[1463] And this is, this is it.
[1464] I think as much as I can, I want to do everything with Neuralink to make things better for people in the future.
[1465] That's a beautiful way of engaging with this, man. It really is.
[1466] And I think what happened to you is tragic, but your perspective is pretty fucking cool.
[1467] It really is.
[1468] It really is.
[1469] It's beautiful to hear.
[1470] And, I mean, I wish you all the best.
[1471] I really hope that this becomes something.
[1472] that allows you to move again and, I mean, and that they keep improving upon it.
[1473] And thank you for risking this and thank you for being the first guy.
[1474] Yeah.
[1475] Yeah.
[1476] No worries.
[1477] People keep saying a lot of weird things about me. Like, you know, you're like an Apollo astronaut.
[1478] I don't see myself that way.
[1479] I know that people keep saying you're the first.
[1480] You're like a pioneer.
[1481] I don't see myself that way at all.
[1482] I just think anyone in my position would have done it.
[1483] I think that.
[1484] I guess it took a bit of bravery.
[1485] I don't think, I just, it, I don't see myself that way.
[1486] I just think that I did it so, um, to show people, like, I did it because I knew that I could.
[1487] I did it because I knew that I was capable of going through it.
[1488] Um, I did, you know, become a quadriplegic and I made it out the other side.
[1489] Like, I feel, I feel good about my life.
[1490] I feel like I managed that pretty well.
[1491] I'm a pretty chill guy, so I feel like I rolled with the punches pretty well.
[1492] And I thought the same thing with Neuralink.
[1493] And so, like, I never thought of myself, like, trailblazing or anything.
[1494] But it's just cool to be a part of.
[1495] And I'm really happy that Neuralink chose me. And I'm looking forward to having some, like, cyborg buddies in the future.
[1496] It'll be cool.
[1497] Yeah, how long before you can link those things together?
[1498] Yeah.
[1499] I guess we'll find out when they get the next patient, like the next participant, like maybe a couple months and we'll be chatting with each other.
[1500] I mean, I've been, you know, having telepathic communications with...
[1501] Pager the monkey for a few months. No one knows about it, but we talk about that kind of stuff all the time. He's oddly obsessed with the new Planet of the Apes movie, but couldn't tell you why. What kind of joke is that, man? You can't crack jokes like that. I don't know if you're telling the truth. You're talking to a monkey telepathically? No, you're joking.
[1502] Yeah.
[1503] See, I can tell because you have human eyes.
[1504] Yeah.
[1505] Yeah, exactly.
[1506] Yeah.
[1507] Do I do?
[1508] Are you sure?
[1509] I think you do.
[1510] Okay.
[1511] Or they're really good.
[1512] Yeah.
[1513] You know, if they could develop an eye just like artificial intelligence can make images like pretty fucking close, maybe they can make an eyeball that just really does kind of like talk to you a little bit.
[1514] Makes you think.
[1515] Just knowing someone's bullshit.
[1516] Dude.
[1517] Yeah.
[1518] Come on.
[1519] Yeah.
[1520] There's no way I talk to Pager at all, on a daily basis at least.
[1521] Now I'm thinking you do.
[1522] Now I'm going the other way with it.
[1523] Yeah.
[1524] It's exciting times.
[1525] It's very interesting.
[1526] The ability that you have right now is limited to computer interfaces, right?
[1527] What about other smart things?
[1528] Could you interact with other sort of electronics?
[1529] Not really.
[1530] It's all through the computer.
[1531] Just because in order to even interact with the computer, it has to be uploaded with that app.
[1532] And so that's why, like, putting it on a phone or something, you would just upload the app onto it.
[1533] Any sort of other devices, there's ways to, like, connect to them.
[1534] So, like for me, even with the Switch, it's through my computer still, but then you run like a cord from my computer through like a converter box and then into the Switch.
[1535] So it's all through the computer right now.
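(As a rough illustration of the bridge he's describing, where everything is routed through the computer: the sketch below relays decoded joystick-style values from a hypothetical local feed published by the decoder app on the PC out to a console through a USB-serial converter box. The socket address, serial port, baud rate, and packet format are all assumptions for illustration; none of this is Neuralink's or Nintendo's actual interface.)

```python
import json
import socket

import serial  # pyserial, for talking to the USB-serial converter box

DECODER_FEED = ("127.0.0.1", 9000)  # hypothetical local stream from the BCI app
ADAPTER_PORT = "/dev/ttyUSB0"       # hypothetical USB-serial converter box

def relay() -> None:
    """Forward decoded stick/button values from the PC app to the console adapter."""
    adapter = serial.Serial(ADAPTER_PORT, baudrate=115200)
    with socket.create_connection(DECODER_FEED) as feed:
        for line in feed.makefile("r"):          # one JSON message per decoded bin
            msg = json.loads(line)               # e.g. {"x": 0.2, "y": -0.7, "buttons": 1}
            packet = bytes([
                0xAA,                            # start byte the adapter expects (assumed)
                int((msg["x"] + 1.0) * 127.5),   # map stick x from [-1, 1] to [0, 255]
                int((msg["y"] + 1.0) * 127.5),   # map stick y from [-1, 1] to [0, 255]
                msg.get("buttons", 0) & 0xFF,    # button bitmask
            ])
            adapter.write(packet)

if __name__ == "__main__":
    relay()
```

(If the implant can later pair with devices directly, as he speculates next, this extra hop through the PC simply goes away.)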
[1536] I don't think it's going to be that way forever.
[1537] I think it's going to be much easier to connect to other devices in the future.
[1538] Especially if Neuralink takes off, like I think it will, then companies will start just uploading the software onto it, downloading the software onto it, so that way you can connect to it.
[1539] It's going to be one of those things where it's Alexa compatible.
[1540] It's Neuralink compatible.
[1541] Right.
[1542] Right.
[1543] That makes sense, especially if there's widespread implementation of this and it turns out to be a real thing.
[1544] It might be something that someone has to have, like you have to have a wheelchair ramp at certain businesses.
[1545] Yeah.
[1546] Yeah.
[1547] New Tesla phone, I'm sure he's just going to build all of that into that.
[1548] All the Optimus robots are going to have it built in.
[1549] Do you think he's going to make a Tesla phone?
[1550] Yeah, he might.
[1551] I think when he said that they might ban Apple devices because they're going to use OpenAI.
[1552] I was like, what is going on?
[1553] Yeah.
[1554] I get real nervous when someone way fucking smarter than me gets nervous.
[1555] You know, when he's saying that if AI, basically what he's saying is, I think, to paraphrase, he's saying that Apple wasn't smart enough to create their own artificial intelligence, but they're smart enough to keep artificial intelligence from running rampant through their operating system.
[1556] Like, I don't think they are.
[1557] Yeah.
[1558] I don't trust it.
[1559] I don't trust it one bit.
[1560] But it's going to be in your head, bro.
[1561] Yeah.
[1562] Yeah.
[1563] One day we're all going to have to trust it.
[1564] You know, if I had Scarlett Johansson's voice in my head all the time, I don't think I would mind.
[1565] It would be dreamy.
[1566] It would be okay.
[1567] It would be okay.
[1568] She got a dreamy voice.
[1569] Well, listen, man, thank you very much for being here.
[1570] Thanks for being you.
[1571] And let's do this again sometime in the future.
[1572] We can see what improvements and how it's going.
[1573] Hey, man, absolutely.
[1574] As we move this thing along, then I'm more than happy to come back.
[1575] All right.
[1576] Thank you very much.
[1577] Oh, do you have social media or anything where people can find out what you're up to?
[1578] Yeah, I have an X, like, @ModdedQuad, I think it's called.
[1579] I have like an Instagram and stuff, and I'm getting other stuff up and running.
[1580] I'm going to start, like, streaming more and stuff.
[1581] So it'll be out there.
[1582] You're going to stream?
[1583] Yeah, I did once.
[1584] I did kind of like a test stream.
[1585] I'm about to do another test stream probably this week at some point, maybe in the next few days.
[1586] Then I'm going to stream, like, video games and stuff.
[1587] That's great, man. I think people would love to see that and love to hear you talk about your experiences through this.
[1588] Yeah, yeah.
[1589] Yeah, it'll be cool.
[1590] You've got a great perspective, man. You really do.
[1591] Thank you.
[1592] Thank you very much.
[1593] It's a pleasure to meet you.
[1594] Yeah, you too.
[1595] All right.
[1596] All the best.
[1597] Thank you very much.
[1598] Goodbye, everybody.