The Joe Rogan Experience XX
[0] This episode of the Joe Rogan Experience podcast was one of the episodes that we made while we were filming the Syfy show Joe Rogan Questions Everything.
[1] So what we did when we did that show, if you've never seen or heard of it, we had a podcast on the show, and a lot of people thought it was a fake podcast, but it was actually a real podcast.
[2] And thanks to the generosity of the Syfy channel, we are now able to release it as a podcast.
[3] So thank you to Tim and Wayne from Syfy, and to Michael and Frank from Arthur Smith, who produced the show.
[5] Thanks also to Todd, who was the director of the show.
[6] We had a great time doing it and I appreciate it very much and appreciate you guys letting me do this like the way we did it.
[7] I think it was really fun to do it this way.
[8] We did it exactly like a regular podcast, just me and Duncan.
[9] And in this episode, we went over what's called biohacking, or what I call dorks with knives sticking things under their skin.
[11] Fascinating subject and I think you'll enjoy the shit out of it.
[12] This episode is brought to you by Legal Zoom.
[13] LegalZoom is one of those sponsors where many people connected to the show have actually used it.
[14] Onnit was formulated through LegalZoom.
[15] Brian made his company an LLC through LegalZoom.
[16] It's a way you can do things online where you can get legal papers filed and do things without actually having to go to an attorney and sit down and, you know, pay for an office visit and pay exorbitant amounts of money to an attorney.
[17] And also, you have to schedule shit and drive and all that jazz.
[18] You can use LegalZoom naked.
[19] You could be naked.
[20] Nobody can stop you.
[21] You know, you could make it a point to only use LegalZoom when you're naked.
[22] It won't have any benefit or any detriment to the actual legality of the papers that you make.
[23] like what kind of stuff can you do?
[24] Well, here's one.
[25] You can incorporate or form an LLC at LegalZoom.com starting at just 99 bucks.
[26] That's really easy.
[27] They can help out with trademarks, copyrights, patents.
[28] If you've got a great idea and you want to protect it.
[29] If you have a family, you can make a legal will for just 69 bucks.
[30] You can also get a living trust, power of attorney, and more.
[31] In the past 12 years, over 2 million Americans have used LegalZoom, and they've saved a shitload of money.
[32] Now you get a special discount from listening to this podcast.
[33] Make sure you enter Rogan in the referral box at checkout for more savings.
[34] LegalZoom is not a law firm.
[35] They provide self-help services at your specific direction.
[36] And if everything spirals out of control and you're in a panic, they actually can connect you with an independent attorney if you need additional guidance.
[37] So check out LegalZoom.com and see how they can help you today.
[38] Use the code Rogan in the referral box. And now, without any further ado, um, the podcast that we did on biohacking, with Duncan Trussell and me.
All right. Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day.
You know what's really crazy? That your mom, after that... um, one of your stepdads was gay. Isn't that like an interesting sort of a snapback from that?
[39] You know, like she was probably around this crazy guy with guns.
[40] It's like, get the fucking.
[41] Give me a guy.
[42] I'll take it.
[43] Yeah.
[44] That's what my mom did.
[45] My dad was like a really violent cop.
[46] And she married a hippie.
[47] Like when she broke up with him... my stepfather had long hair until I was like 15 years old.
[48] He had like a ponytail.
[49] He was a total complete hippie.
[50] Crazy.
[51] Stoner, architect, you know.
[52] Snapback.
[53] You don't want to date on her.
[54] You don't want to be around that.
[55] Was that your dad, Hunter S. Thompson, shooting shit?
[56] Yeah, still is.
[57] Has he still traveled around in a, like, an RV?
[58] Just seeing the world?
[59] Yeah, well, now he's just, no, now he's, like, local in Alabama, but he, like, goes hunting deer still.
[60] And he, like, I told you the fist fight thing, right?
[61] No. This is the weirdest thing.
[62] I was getting a ride.
[63] This is a quick story.
[64] I was getting a ride back from college with my friend, Sean.
[65] You're recording all this?
[66] Yeah.
[67] My friend Sean is driving back to, was driving to Mobile where his dad lived.
[68] And on the way back, I'm riding with this guy.
[69] And he's like, oh, dude, I forgot to ask.
[70] I want to ask you this.
[71] Is your dad's name Julian?
[72] And I'm like, yeah, Julian Trussell. And he's like, your dad assaulted my dad in a bar.
[73] My dad, my dad beat up my friend's dad in college.
[74] When did this happen?
[75] This was, I don't, the assault happened, I think, before me and this guy were in college together.
[76] But if you look at the probability of getting a ride back from college with a guy who was in Mobile, the probability of your dad having beaten his dad up, it's so slim.
[77] It's so slim that, like, my, what are the odds that my, that we both go to this liberal arts college of 500 students and the one other guy from Mobile, Alabama, his father had been assaulted.
[78] by my father in a bar.
[79] Those are pretty small odds.
[80] Miniscule odds.
[81] That's pretty funny too that it would be your dad too because you're like the opposite.
[82] Like I could never imagine you just assaulting someone.
[83] Even if somebody got really mad at you you'd be like, all right, well fuck you man and you just like get out of there.
[84] Like the idea of you assaulting someone is beyond comprehension.
[85] So the idea of your dad being this fucking hollow-toothed, bullet-carrying savage, beating the fuck out of your friend's dad.
[86] Yeah, getting arrested.
[87] Jesus Christ.
[88] He got arrested in the parking lot.
[89] Two cops took him down.
[90] And he got arrested.
[91] My dad came home once with a black eye when I was in high school.
[92] Him and his co-worker.
[93] They were doing a business together.
[94] And the business started going bad.
[95] They got in a fucking fist fight.
[96] And he came home and I remember looking at him.
[97] I was like 16 or 17 or something like that.
[98] With a black eye. I was like, what the fuck, man?
[99] Are you done with this?
[100] You're still duking it out at 40, or whatever the hell old he was back then?
[102] Well, it's a classic... it's a classic man thing. It's weird.
[103] A classic man fought.
[104] Like, it was like, like, think of Hemingway.
[105] That was just part of being a man: you would get in some inevitable fistfight.
[106] And in those days, it seems like you'd get in a fist fight and there were no lawsuits and legal stuff.
[107] It's just like the other guy accepts getting his ass kicked.
[108] You're the victor and that's it.
[109] Or you get your ass kicked and you forget about it.
[110] That seems like the old way it was.
[111] Now, if you get in a fist fight, you end up in, you know, serious legal trouble. You get an instant lawsuit.
[112] But it used to just be like when you see elk smashing into each other in the wilderness. That's what it used to be, being a man. You'd just run into a guy and just start punching each other, and then that was it. Those were the days, back in the good old days, before writing shit down. Hey, once people made a few mistakes, they started writing shit down.
[113] They were like, oh, yeah, okay.
[114] Yeah, blame it on Hammurabi.
[115] It's all Hammurabi's fault.
[116] And that gentleman, his name is Sherid?
[117] Sherrod.
[118] Sherrod.
[119] Sherrod and Lucas.
[120] And they're from Grindhouse Wetware.
[121] And Grindhouse Wetware, first of all, greatest name for a company ever.
[122] What does that mean?
[123] So it comes from the name Grinders.
[124] Get real close to that.
[125] That's what we are.
[126] We're grinders.
[127] Uh, grinders are kind of like, uh, what happens when you mix open-source hackers and body modification with a dash of transhumanism. And wetware, uh, is, you know, your body. So we're literally grinding our biology.
There's all these interesting, uh, little names that you guys have for yourselves. First of all, grinders. We're grinders, we're transhumanists. There's a lot of, uh, self-defining going on there?
[128] Well, it's not exactly that we're self -defining.
[129] We're just guys who really like technology. And we all grew up with... like, you talk about growing up with Hemingway and classic men, and it was all about fighting.
[130] We grew up... we were protected and sheltered, and we grew up with Star Wars, and grew up with these stories about robots that could think, and people that couldn't live any longer and had to augment themselves, and cyborgs.
[131] And well, cyborgs were really cool.
[132] We thought they were really cool.
[133] And, you know, I was in my 20s when we hit the year 2000.
[134] First thing that happened, New Year's Eve, 2000.
[135] And after I got over the shock of nothing hitting the wall, nothing falling out of the sky, I turned to my brother and went, where is my fucking jet pack?
[136] And that kind of summed it up.
[137] I felt that way.
[138] My buddies felt that way.
[139] I didn't realize it, but there were a lot of people who felt that way.
[140] And we were all broke and we didn't know much about science.
[141] And somewhere along the line, like I guess in 2006 there was a kid in Arizona who realized that he could start stimulating his nerves with magnets, with magnetic implants.
[142] He went to a body mod guy and got an implant done in his fingers.
[143] It started going from him to a guy in Europe.
[144] And in, like, 2008, there was a lady, Quinn Norton, who wrote for Wired and wrote for some other places, and she did it.
[145] She went out and said it was a bad idea.
[146] Don't do it.
[147] Other people started doing it.
[148] Okay, so, well, let's back up.
[149] First of all, explain what you're talking about.
[150] You're talking about biohacking.
[151] You're talking about doing something to your body that enhances it, integrating and incorporating technology into your body.
[152] And the first guy was the guy in Phoenix that did the... Arizona, rather. Was it Arizona? Was it Phoenix, Arizona?
Yeah, yeah. He did the magnets in his fingers.
Yeah, as far as we know, that's the first person. He's the pioneer. First documented.
And what benefit is there to having magnets in your fingers? Like, what would that accomplish for him? Um, besides making it easier to put your hand on a refrigerator.
Easier? I think it's annoying. It'd be annoying. You're always getting stuck to things.
Yeah, it doesn't... it doesn't exactly work like that.
[153] These are really tiny magnets.
[154] You have magnets in your fingers.
[155] Yeah, yeah.
[156] I got like, uh, got one around there.
[157] One around there.
[158] Wow.
[159] So, so you implanted magnets in your fingers?
[160] Yeah.
[161] Why did you do that?
[162] Uh, I was drunk.
[163] No kidding.
[164] I'm kidding.
[165] Um, I found out about this.
[166] I was really interested in everything we've been talking about.
[167] I ran into a childhood friend of mine who got it done last year, and when he showed me what he was doing with it, I had to do it.
[168] What was he doing with it?
[169] When you implant a magnet in a nerve-dense area of your hand, it gives you an ability: when you're around an electromagnetic field, that magnet vibrates.
[170] And so you can feel that.
[171] Now, there's a property of our brains called neuroplasticity.
When you provide a stimulus over a course of time, like about six months, your brain starts adapting to it.
[173] It starts building something new.
[174] And I was always fascinated, even as a child, by the idea of having an extra sense. Not necessarily like ESP, but just an extra sense.
[175] Or to be able to hear a sound that no one's ever heard before, see a color no one's ever seen before.
All those stories from don't-do-drugs pamphlets that were like, I can see the music, I can taste sound... that sounded amazing to me. So when I heard about this extra sense, I jumped at this chance to incorporate that into my body.
[177] But what extra sense, the extra sense of using these magnets to detect an electrical field or to vibrate during an electrical field?
[178] Is that what you're saying?
[179] Yeah.
[180] And how are you defining an electrical field?
[181] Power station?
[182] It's an electromagnetic field.
[183] So anytime that an electron travels in a straight line, you have a magnetic field that's coming out kind of in a circle around it.
[184] Okay.
[185] It radiates around it.
[186] Right.
[187] If you have a coil, a coil of wire and electrons going through a coil, there'll be kind of like a donut -shaped magnetic field around it.
[188] So those sorts of fields, they have a property of volume and a property of frequency.
[189] They vibrate at a certain frequency.
[190] and they're loud or soft.
[191] Like a transformer, like an electrical transformer outside of a building that hums?
[192] Absolutely.
[193] So if you were near that with your fingers having the magnets in them, what would happen to you?
[194] I can feel those.
[195] Sometimes when people would ask me at the beginning what it was like to feel this stuff, I would describe it like an effervescent feeling like bubbles.
[196] It felt like bubbles.
[197] And it was really interesting to me. It felt like bubbles?
[198] Yeah.
[199] How does that work?
[200] Um, well...
[201] That's the weirdest explanation for something that...
[202] Her description.
[203] Oh, it felt like bubbles.
[204] Yeah.
[205] I don't even know what to say.
[206] You mean it felt like fizz?
[207] Yeah, kind of like fizz.
[208] Like pop tarts?
[209] Like pop rocks, rather.
[210] Sea foam, I think, is what he means.
[211] Sea foam?
[212] I don't know.
[213] Froth, like dog froth.
[214] Every person that has the implant uses completely different language to describe it.
[215] Yeah, it's been really fascinating to run into other people with implants and talk about this stuff because we are describing the same phenomena.
[216] And it's like, you're asking me, me to describe a color that you've never seen before.
[217] How am I going to describe squant?
[218] It's squant.
[219] Well, is that blue?
[220] No, it's squant.
[221] Is it purple?
[222] No, it's like tasting an orange.
[223] Oh, so it's orange.
[224] No.
Are there any negative repercussions to having these magnets installed in your body?
[225] Well, the first time that someone actually said something that made sense as far as a negative thing, I saw a speech by Quinn Norton where she said, hey, don't do this.
[226] This is a bad idea.
[227] Hers shattered.
What shattered?
[228] Her magnetic implant.
[229] Shattered.
[230] Okay, so when a magnet receives either a lot of heat or a very strong force, it loses its magnetic pull.
[231] But more importantly, these magnets are made out of rare earth elements.
[232] Mine's like a cobalt alloy.
[233] Those things aren't really healthy to take inside your body.
[234] If it shatters... so, this is parylene-coated.
[235] Some people use silicone.
[236] Crazier people use hot glue.
[237] So it's essentially, it's poison.
[238] You're dealing with a toxic element in your body.
[239] If it's not covered.
[240] It's not covered.
[241] When it burst in her, she actually consulted a doctor.
[242] And the doctor was like, well, here's the thing.
[243] We could go through all the trouble of pulling it out.
[244] But it's so tiny and it's so little that it's really not going to bother you.
[245] And so they left it in. And over the course of time, that magnet pulled itself back together. She never got the sense back, but it pulled itself back together, and it didn't harm her.
Creepy little creature... if you guys are interested... moving around inside of her fingers.
How'd she break it? What, did she slap a cop car or something? How did she shatter the magnet?
[246] She didn't say.
Furious masturbation?
I have no idea.
Just furious, possessed, eyes rolled back, one shoe on, screaming like a tiger caught in a barbed-wire fence.
[247] Clank.
[248] Yeah, it could be.
[249] Could very well be.
[250] If I had a guess.
[251] But that usually doesn't happen with the magnets this size or something.
[252] I don't have a vagina, so I wouldn't know.
[253] How could you know what would happen when a woman's in the middle of a furious masturbation session, completely on a thick fucking half a pound of meth? Just slam it home. You never know.
Are you asking how we can imagine what happens?
Yeah. How else does a magnet get shattered? That's as good a guess as any.
Yeah, like maybe her orgasm was so strong it emitted a magnetic field. Meshed up, horny as fuck, maybe on top of an old truck. I'm just coming up with a scenario.
How do you... these are under your fingers?
No, that's a lot larger. But if you guys are curious about...
That's a cock magnet.
Yeah, yeah, there you go.
So, something for your cock. There's a need for... you know, just in a cleaner way of expressing this question: are there any gentlemen, perhaps, that have these magnets put in their nether regions?
I have no idea. I think so. It's been spoken about.
Has it been spoken, or has it been written on our website somewhere?
It's been spoken about.
The closer you can get that to your face... seriously, the better. Just pull it right up to your face. Or move your chair a little bit.
Usually, when it comes to stuff like that, people talk about it and then don't do it.
Oh, so they're saying, hey man, I've been thinking about putting magnets in my cock, just so you start thinking about their cock? Is that what they're doing?
[254] That's what they're doing. And then you're forced... hey, you know Mike? He's going to put magnets in his cock. And they're like, why are we sitting around talking about Mike's cock? There's a lot of shit to talk about in the world. But Mike tricked us. Mike brought up putting magnets in his cock.
He knows how to reach you guys.
Okay.
You can't just talk about sex. That's too easy. No, he has to talk about, what do you guys like? You guys like putting magnets in your body. So he starts talking about magnets in his cock.
But people already do that with, like...
Absolutely. Absolutely.
I'm saying, it's totally serious. I've figured it out. I'm like Kojak or some shit.
But people already do that with body modification. You know, they already get stuff pierced. I've, allegedly, seen some stuff online of just horrible ideas, you know, that someone followed through with, with their junk.
[255] Horrible, like the guy who splits his cock in half?
[256] The guy, he's like it's only one guy.
[257] That's like saying Bigfoot's only one thing.
[258] There's a lot of Bigfoot.
[259] You're saying more than one guy has flayed his cock in half?
[260] Yeah, why do you do that?
[261] It's supposed to feel phenomenal.
[262] It's supposed to.
[263] Doesn't it fall apart when you're having sex?
[264] It's like a hot dog that's been sliced and thrown on a grill.
[265] I hear that it's kind of like shooting, like a shotgun.
[266] Oh, Jesus Christ.
[267] Wow.
[268] Jesus Christ.
[269] I think you guys...
[270] I think you cut your dick and then you just tell everybody it's awesome.
[271] You don't want to admit you're an idiot.
[272] Is this like when they told me we were going to have a punch-yourself-in-the-face contest and I had to go first?
[273] Exactly.
[274] Exactly.
[275] Preposterous.
[276] You guys are, in a way... what you're doing is, you are turning yourself into a lab rat with the intention of gaining a kind of low-level superpower. Like, really, really low on the scale of superpowers: the ability to feel EMF fields. Is that what they're called?
Yeah.
To feel that. That's a really low power, but it's still... you have more superpowers than me. I can't feel that. You're still ahead of the majority of your species. But there's a risk to this. You're risking your life so that you can feel what it's like to be a refrigerator magnet?
[277] Anything that you put in your body, you risk getting sepsis from.
[278] There's a risk of sepsis.
[279] So, yeah, there is a risk.
[280] What exactly is sepsis?
[281] That sounds like septic?
[282] Is that related?
[283] What does it mean?
[284] It means some sort of infection?
[285] Yeah.
[286] Basically bacterial infections of any sort.
[287] So I've had a buddy who's a surgeon who's told me, hey, don't do this.
[288] This is a bad idea.
[289] Right.
[290] Good for you.
[291] You got that done, but don't put anything else in your body because any implant can go bad.
[292] Wow.
[293] I got my nose pierced in Venice Beach when I was 15, and you should have seen that thing swell up, man.
[294] Oh, my God.
[295] It was just so nasty, man. My nose got swollen.
[296] I remember pushing against it in front of the mirror and just, like, the amount of pus, like where you get proud of the amount of pus coming out.
[297] You know, that can kill people.
[298] They get staph like that, and it winds up eating your whole nose off.
[299] Oh, yeah, man, I got lucky, you know.
[300] You should never get your nose pierced.
[301] That's a ridiculous thing.
[302] And there's no superpower in that.
[303] You just look like a dumb ass.
[304] What he said.
[305] I'm with him.
[306] So this is a tangible extra sense, though.
[307] This ability to detect a magnetic field.
[308] I mean, you really can do something that Duncan or I can't with your body because of these additions.
[309] It's profound.
[310] It's something I didn't understand when I got it done.
[311] Um, what started out like feeling bubbles... um, now, like, uh, I'll go to Home Depot and I'll be feeling stuff, because it's, oh my god, so fun to feel stuff.
Feel... what are you feeling?
Um, mainly magnetic stuff.
Some freak hanging around Home Depot, touching all the stuff. Oh my gosh, it's a drill! Oh, wow.
No, but some guy's over there, passed out by the lawnmowers. It's a magnetic field! Oh, oh, oh, oh.
Hey, what's your favorite?
[312] This means you probably have, like, people have favorite flavors.
[313] Do you have a favorite machine you like to put your fingers near?
[314] I've had situations like I've been around really strong magnets, and when I first was around a really strong magnet, I felt a feeling in the pit of my stomach like dread, and it made me want to pull away and run away from it.
[315] Because you're worried you're just going to get clinked?
[316] No, it wasn't a logical thing.
[318] You could never get an MRI.
[319] It wasn't a logical thing.
[320] Well, you know, the funny thing about that is I'm working out what our sensitivities are.
[321] There's been a lot of debate among the people who've had these things on how close we can get to an MRI.
[322] There are some people who talk about it being dangerous because it can move around.
[323] Since this is subdermal, it's not anchored in any way like other subdermal body mods are.
[324] it could move your magnet a bit one way or another.
[325] Someone's got to test it.
[326] But there are people who've claimed through the network of people who install these.
[327] There are people who've claimed that they've had MRIs done and felt their magnet dance around and that's it.
[328] Whoa.
[329] But the MRI is a super powerful magnet.
[330] Yeah, you're talking about... the tiny ones are like 1.5 teslas.
But people go into those things with sweatshirts with zippers on them.
They do. Yeah, I talked to a guy who operates one in a research facility, and...
That's why I'm like, funny you should mention this, because... okay, but that's not standard operational procedure. You're supposed to get down to a robe. I just had one a couple of days ago. You take your clothes off.
You don't... are you sure they weren't just fucking with you? Are you sure? You don't know anything about MRIs. You got magnets installed in your hand?
[332] That's crazy, man. You don't go in there with zippers on.
[333] What kind of a fucking doctor are these people going to?
[334] It's a research facility.
[335] It's not a doctor.
[336] Wait, Joe, can I... what?
[337] You said someone died from an MRI?
[338] What is that?
[339] Well, what I was trying to say is, if you have metal in the room and they turn the MRI on... a child died recently, accidentally, because they had an oxygen tank in the room and they weren't aware it was in there.
[340] So when you turn on the MRI, the fucking oxygen tank comes flying into the machine.
[341] It's a giant, super powerful magnet, and it killed him. By the time they could shut it off, he was already dead. I mean, it's an insane amount of pressure you're dealing with, and it's just because someone was dumb enough to have metal in the room. So I really find it hard to believe that any doctor would allow someone to go through an MRI with a zipper on. That's preposterous. That's just not what they do. So I question who you're talking to.
Talking to crazy people. Lying about putting magnets in their dicks and wearing zippers into MRIs. You gotta hang out with a higher-quality group of humans.
A higher-quality group of humans.
But there is magnetic shielding, because, you know, people that have pacemakers or other implants...
[342] Oh, they put like something over you to stop that, like a suit or something like that?
[343] Yeah.
[344] Huh.
[345] That's interesting.
[346] But you're not doing that with your fingers.
[347] You just let them tingle if you get in there, allegedly.
[348] Don't know.
[349] That's something you really should be careful about.
[350] But I think it's something that should be tested.
[351] Even so, I'm not advocating that someone with this in their finger go in and test it.
[352] But, you know, you could replicate the consistency of human flesh, put the magnet in, and then expose it to something like 1.5 teslas, just to see what happens.
[353] Now, you have this, which you've already done with the magnets.
[354] What is, what's the next level thing?
[355] Obviously, everything progresses.
[356] So what are people considering doing now?
[357] So, um, there's, well, a lot of people are building devices that interact.
[358] with the magnets.
[359] So you can think of it as, because it interacts with electricity and magnetism, it's kind of like a really low level input.
[360] You can input anything.
[361] So if you've got an external device that's measuring, say, the amount of CO2 in the room, you could hook that up through a coil around your finger, around your magnet finger, and have that information transferred into like pulses or how much it vibrates or whatever.
[362] So you're literally using an external sensor to give you data, any data you want, and have it transfer that to the magnet. And after some training, it becomes intuitive. So some people have done it with mainly sonar, and thermal, like a thermal laser.
[363] Now, I know that people have talked about the ultimate future of incorporating technology into the human body: the ability to take the mind and download its consciousness somewhere into an artificial creation. But somewhere along the line there are probably going to be some steps, right? And you guys are kind of taking one of these steps, like by putting magnets in your body. When do you think you're going to see someone who puts a cell phone in their body? Is that coming? Is it going to be like a neural interface that allows you to get online with Wi-Fi?
Well, we're somewhere towards that.
[364] That's something else we're working on.
[365] So our CTO, his dream, the thing that brought us all together, wasn't magnets.
[366] That was ancillary.
[367] It was love, right?
[368] Yeah, well, of course, it was love because we all kind of knew each other, and we all had this love of the future, and we all kind of were told to sit back and let Motorola take care of it, and none of us wanted to deal with that.
[369] But he wanted to build something that he believed that we have the ability to do this stuff ourselves.
[370] We don't have to wait for some big company to do it.
[371] We can do it cheap.
[372] We can do it ourselves, design it ourselves.
[373] And since we know people who can install magnets under our skin, why not an implant?
[374] Why not something?
[375] Well, the goal was to have something... so basically, what it does is, it's a rose compass that electrically stimulates you.
[376] It's probably here on your calf, and it's always pointing north, so that you always have an intuitive sense of where north is.
[377] Now, if you've got the chip talking to some external thing, you could hook it up to a map system like Google Maps and be like, yo, take me home.
[378] And then you have an intuitive sense of where home is always.
[379] What?
[380] Wait a minute.
[381] What?
[382] Hold on.
[383] Wait, back that up.
[384] You have a compass installed in your body? How does it run?
It just runs on the magnetism that's in the compass. It's all mechanical. There's no electronics involved, right? It can be powered wirelessly, so it's got like a wireless coil on it.
So it's powered somehow by an external...
Yeah. Okay. Yeah. And so you can, like, hover it over it. That's what the dream is.
And what is the method of communication to the mind that explains where north is, or how to get home?
So, uh, in the same way that you're using a really simple system in terms of the magnet, you're not using any magic, it's just vibrating... for this, it's just applying a very, very small amount of electricity, enough that there's a sensation. So it's not like blaring, and it's not shocking you, but there is a sensation.
Is this completely theoretical, the idea of being able to find your way home, or is there any...
We have parts of it built. And actually, when we came together and we had that idea, we said, okay, this thing has way too many features.
[385] So what we're going to do is we're going to build, we're going to build an implant that's kind of a halfway step to there so that we build the ground towards it.
[386] And that's actually, it's actually this thing here.
[387] And so what this thing does is it actually, it's a quantified self health metric.
[388] And this is just going to take data.
[389] You can check this out.
[390] This takes medical data and sends it to your phone.
[391] So if you can imagine having maybe six months' worth of your medical data, and it's your data.
[392] It's not anyone else's data, your data, and you can do whatever with it.
[393] And so we wanted to kind of do a halfway project so that we get familiar with implant technology and then move towards really cool shit.
[394] I understand the idea of a compass.
[395] that you could power it somehow.
[396] But what I don't understand is how the compass is communicating with your mind.
[397] So in the same way that the nervous system is communicating with your mind, it's not sending it wirelessly to your brain.
[398] Right.
[399] There are things in your body that's communicating to your nervous system that then goes to your mind.
[400] Right.
[401] And so you're literally making a new mini organ, or another part of your body, that interfaces with your nervous system through electricity.
And then you're just saying it's like a vibration or something? It's like a low-level pulse or something?
Yeah, it's a low-level pulse.
Yeah, I understand that. I understand that you would get a sensation, but I don't understand how it could be directional. I don't see the mechanism for that.
Well, it's a... it's a rose compass, right?
So what does a rose compass mean?
It's, uh... so it's got, like a clock... yeah, it's got north, south, east, west, and then, like, northwest.
Oh, that's called a rose compass.
Yeah. Okay, so it's a compass.
Yeah, it's a compass.
Okay. And so, uh, let's say... let's say I'm facing north. The stimulation will be going this way, right? But if I turn, it's going to start stimulating...
So it's directional, in that wherever north is, it gives a signal pointing where the arrow goes?
Yeah.
So the arrow... it doesn't just act as a beacon, it somehow or another, like, sends a charge off in that direction?
Yeah. And then it becomes intuitive, once you start determining what that signal means. Like, it becomes a new thing.
Sort of like sound. Like you hear sound over here and you know it's coming from that direction.
Yeah. So it's crazy. It's intuitive in the sense that, like, I would be working with, uh, with some of the guys in the lab, and, you know, they've got two or three implants, and they would, say, leave the soldering iron on, and they'd go to reach to grab something, and just them feeling the intensity of the soldering iron would make them pull back.
[403] Right.
[404] And so it's not like they're like, oh, shit, that might kill me, let me pull back.
[405] It's more like every time you reach over here, something screams at you or something like pokes you.
[406] It's like people who are allergic to cats.
[407] If you're around a cat.
[408] You know, like that's a kind of superpower.
[409] A person who's allergic to cats can walk into a house and be like, you have cats, right? You have cats. They can sense the cat because they start sneezing and coughing, especially if you're dirty and you don't clean your house. Yeah, well, generally, people who have cats... cats are dirty little creatures, pretty filthy. When you walk into a house and you get that first waft of the cat litter box mixing in with old bananas. So nice. You have a box of shit in your house. Yeah, a nice dirty box of shit, and a filthy witch animal peering out at you from the wall, ready to smother your baby while it's sleeping. Suck the breath right out.
[410] Are you guys going to destroy cats?
[411] That's awesome.
[412] I think this, people like you, make me love existing today because the idea that there are human beings at this very moment, and I picture you in a basement, in basements with straight razors, shaking, cutting themselves open and just putting chips inside of their body.
[413] No, they go to doctors and have all this done, man. Come on, wait, dude.
[414] We're in a fantasy.
[415] See?
[416] Leather straps in their mouth.
[417] Bite down on a twig.
[418] You guys are both right.
[419] You're both kind of right.
[420] See, yeah, you don't all go to a doc.
[421] What doctor do you go to?
[422] What doctor do you guys go to?
[423] I know.
[424] Dr. Evil.
[425] Dr. Evil.
[426] And I imagine that before you guys put the magnets in, you're snorting OxyContin or, like, you're definitely on painkillers, right?
[427] Well, I got my implant on.
[428] I got it done with no pain meds.
[429] And, oh, my gosh, it was a rush.
[430] I never had that much.
[431] That's a new meme.
[432] Dude is standing there with a razor blade, and the photo says, I got it done without painkillers.
[433] And, oh, my gosh, what a rush.
[434] I don't.
[435] It was, it was something else.
[436] The amount of endorphins you got, it's trauma.
[437] It's trauma.
[438] Your body's going through trauma, and your body releases all that endorphins, all at once.
[439] You definitely don't remember the fact that you just got hurt.
[440] You're going into shock. I don't advocate people doing this. I don't advocate people opening themselves up at all. That's not going to stop anybody. It's too late. Right now there's already seven hundred teens with X-Acto knives, putting their dad's wristwatch... bandana in mouth, tugging. Well, but kids, if you're watching this, seriously, don't. Kids, it never ends well. Someone already tried it, and they're kind of fucked up because of it. Who's that? Who's that? Who tried what? So one of the people that pioneered this, this lady out in Scotland called Lepht Anonym, she did this talk called Cybernetics for the Masses, where she was advocating, you know, implants, and she did it herself. She sterilized... what did she use? She used vodka. She used vodka and hot glue to coat the magnets.
[441] She, in her speech, she told everyone, hey, I don't care how hardcore you think you are, get a spotter.
[442] Yeah.
[443] Because you will pass out.
[444] And that, that to me, that to me. She also used a vegetable peeler.
[445] Yeah.
[446] That to me is not, that to me is not DIY cybernetics.
[447] That's irresponsible.
[448] That's a crazy bitch.
[449] I can tell you that when you told me your name.
[450] That's a shit.
[451] You don't leave it home with your keys.
[452] I've seen that speech Lepht Anonym gave, and I emailed her directly after the speech because I wanted to get her on my podcast. But, first of all, I didn't know if she was a she, because she's androgynous.
[453] You can't really tell.
[454] But she looks like she fell out of a Philip K. Dick novel.
[455] Like, she is a really fascinating being.
[456] Like, it's a really curious being.
[457] And she seems kind of nonchalant about what she's doing.
[458] And it reminds me of you guys a little bit, because you guys seem to be brushing off the fact that what you're doing can make it so that in 20 years you are just basically a trembling drink machine.
[459] Somebody could put a cocktail in your hand and mix it up because you have some kind of severe neurological damage from the weird chemicals in your magnets going into your brains.
[460] It could be true.
[461] You're right.
[462] You're right.
[463] Totally right.
[464] We probably would be dealing with something worse than a Parkinson's-like state.
[465] Can you get up on the mic, man?
[466] Sorry.
[467] That's okay.
[468] You're right.
[469] There are risks.
[470] When Lepht gave that speech, she gave it with the intent that people out there would see her process and go, oh my god, you could do it so much better, you could do it so much safer. And when people are talking about what grinding's about and why it's called grinding, that's part of the spirit of being a grinder: being open about your processes, because you're hoping that someone will look at your processes and go, oh yeah, I see what you did, I see how you made it better, I can make it even better and even more safe.
[471] But why grinders?
[472] So it originates from a comic book called Doktor Sleepless.
[473] And there's a...
[474] God, you guys are dorks.
[475] No, they're not dorks.
[476] It's awesome.
[477] I want to be a grinder.
[478] It's not a bad thing.
[479] But you guys are dorks.
[480] And so there's a...
[481] Not dorks.
[482] It's not a bad thing.
[483] I'm a dork too.
[484] You're a dork too.
[485] How dare you.
[486] You're a dork.
[487] I guess I am a dork.
[488] You read Walking Dead comic books.
[489] You're a dork.
[490] Oh, man. That's intense.
[491] And you tell me I have to read them.
[492] That's intense, man. You've got to read them. He's a dork. There's nothing wrong with it, man. There's nothing wrong with being a dork. Don't be scared of being a dork. Okay, well, fine, I'm a dork too. I'd rather be a dork with magnets under my fingers. I'm not going to be a dork. I'd like to be a dork. It's not bad to be a dork. But the fact that you guys call yourselves grinders out of a comic book, that's clearly pretty dorky. Yeah, it is, but it's not bad. See, you embrace being a dork. I think the term dork is a dorky term. No, it's an awesome term, and that's what we were when we were out there looking for Bigfoot.
[493] We were two dorks.
[494] You're right.
[495] Look at two dorks looking for Bigfoot.
[496] By the way, having the time of our lives with fellow dorks.
[497] There's nothing wrong with being a dork.
[498] No disrespect intended.
[499] No disrespect.
[500] You're not taking yourself that seriously.
[501] You're not taking yourself that seriously.
[502] You're naming your whole group after comic books.
[503] There's nothing wrong with that.
[504] A part of what we do is we're still using the scientific method.
[505] We're not just like.
[506] Sure.
[507] Well, what we're doing is we're putting our method out there for it to be criticized so that it's safe in our community.
[508] Right.
[509] And it's not, you know, cut yourself open with vegetable peelers, and it's not have the method be held up by the FDA for 50 years.
[510] It's about knowing what the risks are as an individual and deciding whether to take it or not.
[511] Now, that piece of plastic, silicon, whatever it is, what is that exactly made out of that you showed me earlier, the chip?
[512] Oh, that's a regular circuit board.
[513] There's all sorts of bad shit in there for you.
[514] Yeah, what was it made out of?
[515] Do you know?
[516] Fiberglass, copper, metal, plastics.
[517] Where would you put that?
[518] Inside a silicone, silicone shape.
[519] There's a bioproofing process.
[520] Like a sheath?
[521] Like a condom, silicone condom sort of a thing?
[522] More like a...
[523] Something more secure.
[524] You take a CNC machine, and you make a reverse mold.
[525] So this is an unpopulated board because all of our regular prototypes are being tested.
[526] On who?
[527] On... for functionality.
[528] Yeah, like heating it, crushing it, exposing it to all sorts of crazy stresses.
[529] Once we're confident that we have something that can handle the stresses of being inside someone for a year, which is our goal, like, get it inside someone for a year and take it out.
[530] Then we're testing bioproofing.
[531] Bioproofing is doing a pressure -injected silicone mold.
[532] Oh.
[533] Wow.
[534] That just sounds like it would hurt.
[535] Do you guys have in your...
[536] That's not inside you.
[537] I know.
[538] What I want to know, though, is, like, what part of the body would you install this thing in?
[539] Well, Tim's been talking about the forearm.
[540] Why the forearm?
[541] Because it would be cool to look down at your arm and see LEDs light up underneath your skin.
[543] So that's part of it.
[544] You guys aren't just pioneers trying to redefine humanity.
[545] You just want to have shining lights under your skin.
[546] And they want to capture some dork pussy.
[547] Let's be honest.
[548] I think that captures many different types of toys.
[549] This is our jet pack.
[550] I bet, man. If you're a super baller in the grinder community, if you walk around with a glowing implant under your arm.
[551] Yeah.
[552] Do you guys have in your community somebody who you know will put?
[553] anything under their skin? Like, somebody who walks around jangling with metal pieces and lights? Is there a person who's got the most implants? I don't know. I don't think we've reached there yet. What's the most incredible, like, the most extreme implant that you're aware of? Maybe Rich. Yeah, probably fucking Rich. Rich. What's going on with Rich? So he's got two magnet implants in his fingers, and he actually has a coil, like a huge coil, that he uses to be able to, like, hear radio stations and all sorts of craziness. They're basically implanted earphones. Wait a minute. So he has magnets in his fingers and in his ears? In his ears? His ears, yes. Magnets? Yeah. So he's got magnets in his fingers and in his ears, is that what you said? Yeah, he's got, like, two in his ears. Man, that's scary. Yeah.
[554] They'll let a dude operate on your ears and stick magnets in that?
[555] And so what's the coil?
[556] The coil is like any coil.
[557] It's stimulating these magnets, makes a magnet vibrate.
[558] So he hooked up this coil to a little amplifier, and that amplifier is sending sound through it.
[559] Wow.
[560] Like your standard speaker, right?
[561] Right.
[562] You send sound through a speaker coil, makes a magnet jump back, or makes a magnet force a coil back and forth, and you make sound.
[563] So he's using this, sending electromagnetic waves to the magnets in his ears.
[564] His magnets dance around, and they give him sound.
[565] So he was able to use a Bluetooth connection to his phone to hear his phone through his invisible earphones.
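The mechanism described here is ordinary speaker physics with the magnet moved inside the ear: audio current in the external coil makes an oscillating field, the implanted magnet vibrates at the same frequency, and the ear hears it as sound. As a rough sketch (the `tone` function and sample rate are assumptions for illustration, not any real device's code), the drive waveform the little amplifier would push through the coil is just a sampled sine wave:

```python
import math

SAMPLE_RATE = 44100  # samples per second, standard audio rate

def tone(freq_hz: float, seconds: float) -> list[float]:
    """Generate a drive waveform for the coil amplifier: a sine wave
    at freq_hz, with sample values in [-1, 1]. The amplifier turns
    these samples into coil current; the oscillating field makes the
    implanted magnet vibrate at the same frequency, which the ear
    perceives as a tone at that pitch."""
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]
```

In principle any audio source works the same way, which is how a Bluetooth stream from a phone ends up as sound in the "invisible earphones": the phone's audio samples become coil current, and the magnets do the rest.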
[566] That's insane.
[567] Yeah.
[568] So, yeah, it sounds really crazy and dangerous, but I think we're also trying to establish a precedent. No matter what organization pops up, there are going to be cyborgs in the future. Strong words. I think that's what we're moving towards, and I think that we want to move to a future where people are openly experimenting with themselves, instead of having to go to one source, you know, like company X, to get their implants. And I think there's a sort of liberation that you have by either making your own implant or knowing what it was made of, who coded it, what the code is, and all sorts of stuff. So in our community we share all of the information. There is no weird outside control, because everyone knows how it was built, and if you don't like how it was built, you can change it. Aren't you already a cyborg if you wear glasses? Isn't that already an incorporation of technology into the human body? I try to avoid, like, the word cyborg, because the moment that people start saying cyborg, everyone's like, you're not a cyborg, cyborgs are this. Doesn't it mean, look, someone merged with a machine? These things you have sitting on your face, that's how you see better, and without those you wouldn't see so good. I think, I thought a cyborg means that it's part of you now. You can't take it off. Is that what it is? Permanently in you? Yeah, you've merged with a machine. Hearing implants, those are first steps. Yeah, yeah, yeah, hearing aids. But you know what else is crazy and dangerous? Learning to build an airplane.
[569] Like, think of the Wright brothers way back when they were telling people, we're going to try to make a device that makes us fly like birds.
[570] Frightened, terrified people, like, are you out of your mind?
[571] You're going to fall.
[572] You're just going to fall to your death.
[573] And it's true.
[574] There's an implicit risk whenever you're trying to create some new invention.
[575] And if you wait for the FDA, if we sit around and wait for Apple or for Microsoft to get this stuff passed, it is going to be 50 years,
[576] a hundred years. Because we need you guys to do it. A few of you are going to have to turn into the Toxic Avenger. A few of you are. It's true, man. You need it, because they've made themselves guinea pigs for the sake of our species. Well, I think they did it because they're enjoying it. Well, it's worse than a 50-year wait. What's bad about it? Microphone. What's bad about it? Thank you. No problem. What's bad about it is the fact that there are people who can't afford what an implant would cost. And, like, with cell phones, I saw cell phones be a tool of people who had tons of money.
[577] And I saw that with every other technology.
[578] And when you're talking about modifying your body and the ways that we're talking about it, and it seems like it has so much potential to be a useful tool for people, to make that something that's only available to people with large amounts of capital seems like...
[579] Wrong.
[580] Well, then, I don't want to live in that.
[581] Kurzweil has already addressed this
[582] quite accurately when he talks about technology being applied in the cell phone world.
[583] He's like, initially it was just a privileged few.
[584] He goes, now, like, 70 plus percent of the population on the planet has cell phones.
[585] You can go to the jungles of Africa and find people who have cell phones.
[586] People have cell phones everywhere you look.
[587] So it starts off like it's really expensive.
[588] But if it proves to be effective, it becomes accessible to almost everyone.
[589] Which is true.
[590] But if you look at waves of technology, right, and the evolution of technology, the people that get it first get to dictate how the technology is used, and they get to dictate the political landscape, right?
[592] So, for example, cell phone users don't get to dictate the political landscape of cell phones.
[593] So that's the one example that we use.
[594] Yeah, but it's not cell phone producers.
[595] You're talking about people who buy them.
[596] You're talking about people who can afford to use them.
[597] Now, they're saying that when a cell phone is invented, when Verizon builds a cell phone, they get into an agreement with the NSA, and there's code inside your cell phone that makes it so it's tracking you.
[599] Imagine that technology applied to some kind of implant.
[600] Imagine if they came up with some neural prosthetic where, you know, the neuroprosthetic that you can attach to your hippocampus that allows you to record your memories.
[601] Imagine if the government invents that and has some secret code in it, they can shut off your brain.
[602] Okay, but that's not new what they were talking about.
[603] They were talking about privileged people who have access to this power.
[604] We were talking about proprietary property.
[606] But even, yeah, ownership.
[607] So like, with cell phones, you have chips that are really good at decoding video.
[608] Really, really good at it.
[609] And they're really tiny.
[610] And you know, they're also kind of cheap, but they're not open for everyone to use.
[611] So when you have something like the Raspberry Pi come along, they have to beg.
[612] What's a Raspberry Pi?
[613] It's a handheld computer.
[614] It's 35 bucks.
[615] Does everything a regular computer does.
[617] Oh, yeah.
[618] It's crazy.
[619] Like on Reddit, it always pops up with some new thing that Raspberry.
[620] It's just a super cheap operating system, right?
[621] It's a super cheap computer.
[622] It's a super cheap computer.
[623] It's about the size of a credit card.
[624] It does everything that computers should do.
[625] So you can get on the net.
[626] You can use it to do.
[627] Credit card?
[628] It's credit card palm sized.
[629] It's palm sized.
[630] It's a screen as well?
[631] No, you need a screen, keyboard, whatever interfaces. It's just the box.
[633] It's the actual CPU, the whole deal of the box itself.
[634] Wow, it's credit card size.
[635] And they had to make deals with non-disclosure agreements, and make all sorts of deals, to attain the technology for the cell phone parts that they're using in there.
[636] That's not open to everyone.
[637] So, you know, in that same way, we believe that people create better when they get the keys to the candy store and can actually experiment with all the parts that are being used and being developed, as much as possible.
[639] We want to encourage that playing.
[640] That's interesting, but what I was concerned with more was not the people that are going to profit or not profit from creating these things.
[641] I was talking about people getting access to the technology itself.
[642] And it seems like that's inevitable.
[643] It seems like if it's effective for rich people, it will eventually trickle down to the rest of the population no matter what, as long as it's really worthwhile.
[644] But I think that when you've got a particular group of people always getting first access, they can always determine how it's used.
[645] And by the time the technology is democratized to a point where it, you know, first advantage doesn't matter, then there's something else because they're sitting on the capital from the last thing that came along.
[646] So you've got this perpetual cycle of: well, we're making it first, so we always get to dictate the terms, the legal and technical terms, by which this thing or this type of thing gets used.
[647] Right, but isn't that what ingenuity is all about?
[648] Isn't that why people spend so much money on companies to try to develop products?
[649] And isn't that like sort of the motivation?
[650] The motivation is to capture the marketplace and to be able to sell this incredible thing that they've created.
[651] So you're saying it should be locked down?
[652] No, no, no. I'm saying it shouldn't be locked down.
[653] But I'm saying I meant lockdown as in once something is created, basically it's available to everyone.
[654] It shouldn't be available just to this one person that created it.
[655] Yeah, I think so.
[656] And I think that's the most ethical way to do it in this case, because when you're talking about augmenting humans, you're talking about giving them additional information about the world.
[657] And this is our most powerful tool ever.
[658] And so when you're talking about supercharging this, you're giving individuals a great advantage over others.
[659] And with any other technology, you know there's abuses.
[660] If you look at nuclear technology, Right?
[661] The Soviet Union and the United States got to dictate foreign policy for how many years?
[662] Because they had that and other countries didn't.
[663] And I personally think that that's the scale of power we're talking about.
[664] When you're saying like neural implants and all of these really powerful devices, I think that the first people that get it get to dictate how everyone else plays.
[665] so you're essentially what you're saying is that these new technologies are going to be so powerful that once people get control of them they'll literally be able to enslave great groups of people who don't have them they'll have power over them they'll have advantage over them and they'll decide or could decide to use that advantage to control the monopolize the use of these technologies monopolize the use of these human improvements these technological human improvements and they'll just hoard them all of themselves Yeah, I mean, not in any...
[666] But has anybody ever done that with anything?
[667] Technology.
[668] Is anybody hoarded technology ever for themselves?
[669] It seems like if you can sell it, people sell it.
[670] Well, it never lasts long, but what I'm talking about are the rammed...
[671] The social ramifications of people trying to hold on to that for whatever.
[672] Because, for example, everyone, every army today uses guns.
[673] But we're still reeling from the cultural effects of one group of people having guns and another group of people not having guns.
[674] Right. We're still reeling from, you know, countries hating each other, and ethnic tension, and whatever. So it's not so much about the technology. It's about: this is what happens when you try to lock it down. Eventually it's going to spread, but you're causing damage by trying to lock it down. Right. That seems to be... I don't know, it's a funny argument. It's an interesting argument, and it gets into a social engineering or socialism point of view, where you've got to go, well, how do you decide to distribute technology then?
[675] There are companies that, how is it handled?
[676] Like Adafruit, SparkFun. They're open source, and they still make money.
[677] Well, it certainly is.
[678] I mean, that's, if any place, embraces open sources, the Internet and technology, and, you know, of course, Linux and Unix.
[679] There's always been a whole open-source community online, but to dictate that that's mandatory, that's where things get sketchy.
[680] I'm not saying that I want a future where corporations don't sell this. I want people to have options. I totally understand what you're saying, but it goes against every other way we manage innovation. The way we manage innovation is people get copyrights and trademarks. You create something, and that becomes yours, and then you own that, and then you can sell it or license it out to other people. What you're essentially saying is that when it comes to a human benefit, now you no longer have the option; now you're going to distribute it freely to everybody. And I say, like, you're regulating in a way that has never been regulated before, and that's a socialist idea. It's interesting, it's debatable, but what you're talking about is sort of a mandatory type of a thing. Well, I don't want to say, hey, Google, you have to make this implant open source. What I'm saying is I want a future where you can either go to a corporation or you can go to the open source community. But isn't that inevitable?
[681] I mean, that's what you have with cell phones.
[682] I mean, you can buy an Android phone, or you can buy an Apple phone, or you can buy some no -name phone.
[683] I mean, eventually, if something is worth something, people sell it to the point where you're going to have leaders in the marketplace, but once the public has access, those leaders are based on the market itself.
[684] They're based on what people buy, what they like, what they enjoy.
[685] You can't monopolize. If everyone's selling a similarly priced product and one person's is better, that's the one that succeeds.
[686] And that's sort of where we're at because of the fact that we don't just say when something comes along, like, hey, everybody has access to this.
[687] Fuck patents.
[688] Fuck trademarks.
[689] You know, that's literally against the way innovation has brought human beings to this point in the first place.
[690] But there's other, I think there's other motivations than capital.
[691] I think that what the motivation in this kind of technology is so much bigger than money.
[692] What you're talking about is achieving the ability with technology to produce emotions or to create psychic states.
[693] That is an amazing thing, and that's an amazingly powerful thing.
[694] So is calling people.
[695] Yeah, but you know what?
[696] I'd rather have an orgasm helmet than a cell phone.
[697] But look, it's not either or.
[698] It's not either or.
[699] But I'm saying cell phones didn't get socialized, and this technology shouldn't either.
[700] Well, I don't think they're saying it should or shouldn't be socialized.
[701] I'm just, they're trying to make an open source version of this stuff.
[702] I don't think that Motorola is going to do that for us.
[703] I don't think they're going to do that for us.
[704] Who the fuck knows what they're doing?
[705] Who knows?
[706] But what I'm saying is, if you make it, you should be able to distribute it to everybody if you want to, but you shouldn't be forced into distributing it to everybody.
[708] I want a future of options.
[709] And in terms of the technology coming out, I want the open source and the closed source to come out at the same time.
[710] Because any, any lag time between the closed source and the open source, again, it starts to produce problems.
[711] So I want a future where people have options as to what they're going to get.
[712] And I think that we've already seen problems with technologies being pigeonholed, so that, you know, we can't not have a smartphone, but we know the fuckery that's going on with smartphones, right?
[713] And so there's, what do you do?
[714] Well, there's a lot of problems with smartphones.
[715] If you really want to get to the bottom of it.
[716] The real problem is they're all based on conflict minerals.
[717] The real problem in smartphones is that you follow every smartphone down to its source and you've got a little kid in Africa with a stick knocking rocks out of the dirt.
[718] And that's real.
[719] And they can't fix that yet.
[720] They really can't.
[721] They don't know what to do yet.
[722] So they need these conflict minerals and they're not available in very many spots of the world.
[723] And the places where they can get them the cheapest, that's where they get them.
[724] and that's why you have civil war in the Congo.
[725] That's why you have people fighting over these resources.
[726] So you've got a real problem, besides the innovation, you've got a real problem in the morality of actually owning a cell phone because everyone who owns a cell phone is a piece of shit.
[727] If you really get down to the core of it, I mean, this is just, there's no way around it.
[728] I think this is an unbelievably fascinating subject, and I think it's inevitable that we have this sort of discussion in this argument.
[729] And I think what we really need is the idea of capitalism or competition merged with morality and ethics and humanity.
[730] And instead, what we usually have is he who gets to the top of the mountain kicks everybody in the face that's trying to get up.
[731] And instead of like pulling people up with them and humanity benefiting and sharing something like that in an open source manner because you think it's the right thing to do, instead of mandating it, I think it almost should be a part of success itself.
[732] That success itself sort of generates altruism, generates happiness, generates generosity.
[734] It should.
[735] It's like once you have some and you care about others, you should give, you should help, you should boost them up.
[736] And we don't have that attitude in this country, unfortunately.
[737] We have this ultra -competitive attitude which has spawned so much innovation because of it.
[738] But along the way, it's also made a bunch of fucking assholes.
[739] Yeah.
[740] You know, a bunch of heartless assholes just caught up in the hunt.
[741] Don't you guys think that it's eventually going to be impossible to keep anything private?
[742] Yes.
[743] So we're entering into a future where there's really not going to be such a thing as secrets? We're entering into a future where everything will eventually be leaked in some way. Everything will. Well, I'll go you one better: I don't think money's going to be real anymore. I think we're going to get to a point where ones and zeros are not going to cut it as far as you have this and he doesn't have that. And I really believe that we're going to get to some weird place, just by the nature of the progression of technology itself, the dissolving of boundaries, the access to information.
[744] Ones and zeros are all you have when it comes to money.
[745] Money is just information.
[746] It's going to get to a point where the idea, the paradigm that we operate under, we think it's important.
[747] I keep my stuff, you keep your stuff, I pay for this, you pay for that.
[748] It doesn't exist anywhere else in nature.
[749] We've decided that this is normal.
[750] We're going to get to a point in time where you're not going to be able to lock it down anymore.
[751] You're not going to be able to hold on to money.
[752] It's not going to mean anything.
[753] We're going to have to decide what the fuck to do with all the different shaped houses. We're going to have to decide who gets the food. But it's not going to be real anymore. It's going to merge into some next-level shit, and that's the future of humanity. I hope so too, but then I hope people don't get lazy as fuck because of that. Well, one of the reasons why people have done so well, and it's also arguable whether or not doing so well is a good thing, but one of the reasons is that we've needed to succeed in order to survive. How the fuck do you think we got out of caves? How do you think we didn't get killed off like the Neanderthals?
[754] How do you think we didn't become food for predators?
[755] We're weak and slow.
[756] Because we innovated, and we pushed forth, and we got the fuck away from all the things that are dangerous.
[757] We locked up cities.
[758] We invented guns.
[759] We did a lot of shit in order to make innovation possible.
[760] Because if you go to places on the earth, like the Amazon or Africa, there's no fucking innovation.
[761] You're wearing a piece of skin over your dick and you're looking for something to eat.
[762] And that's what you do every day.
[763] You get up in the morning early and you go look for something to eat because if you don't, you die.
[764] My favorite example is when we started growing crops, right? Before that, most of a human's time was spent looking for food, right? You don't have time to ask, you know, Mom, what is that? Because you were fucking either hunting something or running away from something. That was just about your entire life. When you are now growing food, you now have a wide space in your day where you're not doing anything, even entire seasons where you've got food stored up.
[765] So what do you do?
[766] You start asking questions.
[767] You start making art. You start writing things down because you've got time now.
[768] I think technology should be about liberating people to do what they want.
[769] I'm sorry to cut you off.
[770] But what you're saying, this to me is one of the great naive ideas of transhumanism, which is the notion that if you remove from human beings the need for something.
[771] In other words, because the ultimate goal of transhumanism, I think, is to shrink down the moment between what you can think you want and the moment of having it to nothing so that you instantaneously have something, whether it's by neurologically stimulating your brain in a way that's exactly identical to reality so you can experience any feeling state that you want or whether it's using matter assimilators to build something that you've contemplated.
[772] And all of humanity has been based on overcoming the obstacle between those two things.
[773] But in overcoming the obstacle, you gain wisdom.
[774] When you're learning how to play piano, you don't just get to learn how to play piano.
[775] You get to learn the discipline of years and years of working to play piano.
[776] And suddenly we can download into someone's mind how to play piano.
[777] If we remove that discipline, then what Joe's saying is you end up with slugs.
[778] But I think that, no matter how hard a dog tries, it can never
[779] learn calculus.
[780] There's just an upper limit.
[781] You just, no matter, it can study forever.
[782] And it just, it won't happen.
[783] And so I think that there are things that we just, as humans, we just, no matter how hard we try, there's an upper limit.
[784] And so when you begin, let's say, downloading information, there's some things you'll get easy.
[785] But there are other things that no one's ever discovered that you now have to discover, new ways to play that have never been even, you know, thought about ever.
[786] So I think that when you augment, when you make yourself more intelligent, more capable, now you don't have to do hard work to do what humans can do, but you still have to do hard work to do whatever.
[787] If I can explain it in layman's terms, this is a technological version of mo money, mo problems.
[788] Exactly.
[789] That's what it is.
[790] Yes, you don't get it.
[791] You just make your problems far more complex.
[792] Like people thought, you know what, once human beings have supermarkets and you just go with a credit card.
[793] You don't even have to bring around a bag of gold.
[794] Then no one's going to be depressed.
[795] Turns out, people are more depressed.
[796] They're sad.
[797] They're looking for meaning.
[798] They're not hunting and gathering their food.
[799] They're missing all those rewards that are built into the human genome.
[800] They don't get them anymore.
[801] So what do they do?
[802] They take Prozac and they drive fast and they watch stupid movies.
[803] Yeah, can I put it in...
[804] Mo' money, mo' problems.
[805] Let me put it in non-layman's Burning Man terms.
[806] If you take the Tibetan Book of the Dead, one of the bardos or one of the hells in the Tibetan Book of the Dead is called the hell of hungry ghosts.
[807] And what this hell is is these beings living in this weird place have this infinite appetite and anything they can think of to eat immediately appears in front of them.
[808] So they have the combination of always being hungry mixed in with being able to instantaneously create any food that they want.
[809] And that creates a hell state where all they're doing is feeding and eating to try to quench this endless human appetite. Now, obviously, we're really far away. That's probably coming, though. I mean, if you wanted to think of something that someone would invent, that's without doubt inside the realm of possibility. We already have boner pills for old dudes. On their deathbed, they chomp down a couple of pills, they got a zombie cock. We already have that. You're telling me they're not going to make something where it just keeps your hunger going? You never get satisfied, you'll be hungry all the time. What's your favorite thing to do? Isn't it eat? Mine too. Wouldn't it be great if you were hungry always and you could just be a professional eater all day? Yeah, without a doubt, some idiot. That's how we design fast food. Think about what people have done to their lips. People would do that. People have blown up their lips, put magnets in their fingers, no offense. They would definitely do that. I think that if that's something negative about the human condition, I mean, if we have that technology at some point, then we would also have the technology to change it. And that's another discussion, whether we should get rid of the hedonic treadmill.
[810] Well, I'm not even sure that it's a negative.
[811] I'm just, it's almost like a pattern.
[812] I mean, I think whatever it took to get humans to this point in history, whatever it took to get us to what we are now, it's an incredible process.
[813] I mean, from whatever we were, from multi -celled organisms onto this thing with laptops, that process is insane.
[814] And the idea that that process just changes because we add a chip?
[815] No, there's going to be like a growing phase, and then it's going to become something new, just like it became this. Yeah, if this is unrecognizable to some crazy Neanderthal 50,000 years ago, then what we will be 50,000 years from now is probably unrecognizable to us. Yeah, and whether it happens slowly or quickly, what has happened in our lifetime, man. The idea of the internet crept up on us so damn quick that we all just accepted it as a fact, whereas if you brought it to any other time in history, it's the most insane piece of sorcery and magic the world has ever known.
[816] Just 20 years ago, it would have been insane.
[817] Just 20 years ago.
[818] Conjuring a dragon.
[819] The idea that you could have a little glass box in your pocket that answers all your questions.
[820] Amazing.
[821] You could send messages including video from people to people all around the world.
[822] It's insane.
[823] Everything about it's insane.
[824] Incredible.
[825] And it's just a part of our accepted everyday life.
[826] Whether it happened slowly or quickly, you know, we think of it as slow in the course of human history.
[827] It's a massive burst of innovation.
[828] But in our lifetimes, it's been 10, 20 years, and then boom, it's all here.
[829] But the idea that you're going to stop it that, you know, or that we should even resist the next level.
[830] It seems pretty silly.
[831] I just, you can't resist it.
[832] You can't stop it.
[833] It's just, it's interesting to sit around and go, what the hell is it going to be?
[834] Yeah.
[835] That's where it gets weird.
[836] Yeah, and we don't know.
[837] But I don't think that primitive men
[838] could have put a person on the moon if they were still worrying about catching that pig.
[839] No, you're absolutely right.
[840] Yeah.
[841] And whatever aspirations we'll have in 10 ,000 years, we won't get to that if we don't do certain things.
[842] Yeah.
[843] And I think that a part of a part of humanity that I think we should keep is our need to explore and understand.
[844] And I think that if we're still worrying about, you know, getting cancer at 60, we're not going to get there.
[845] Right.
[846] I think it's inevitable that we still keep exploring.
[847] I really do. It's why we're here.
[848] It's a whole part of the whole thing.
[849] There's no getting around that.
[850] No. You got a video to show us?
[851] Well, we've got something queued up with Tim when our buddy, our CTO, got his magnets installed.
[852] He got it on video.
[853] Would you like to check that out?
[854] Yeah, I want to definitely see that.
[855] Slop that out.
[856] Oh, Jesus, Tim.
[857] This is Tim?
[858] Oh, shit.
[859] Yeah, this is.
[860] Tim.
[861] This is Tim.
[862] Oh, my God.
[863] Holy shit.
[864] Don't know.
[865] What's that sound?
[866] Hold on.
[867] Pause that.
[868] Pause that.
[869] Why does that sound like that?
[870] What kind of soundtrack is on that?
[871] That's so stupid.
[872] Kill the sound.
[873] Just play that.
[874] Because obviously there's something going on behind it.
[875] It might even be a movie.
[876] Wow.
[877] Who's the guy doing the surgery?
[878] Some asshole.
[879] They found him at Home Depot.
[880] They were running around, touching lawnmowers.
[881] He's like, hey, I could do that for you.
[882] He's someone who does tattoo work, and he specializes in doing subdermal stuff.
[883] Wow.
[884] So there are a lot of people who do...
[885] Never trust a tattoo artist with shitty tattoos.
[886] There's a thing called a laser tattoo removal sign.
[887] Look into it.
[888] Never trust a surgeon with shitty tattoos.
[889] Especially on his face.
[890] Surgeons with tattoos in their face, boy.
[891] Avoid.
[892] You're going to go under in 10, 9.
[893] I'm going to fuck you in your house.
[894] What?
[895] What?
[896] Okay, so this guy is caught.
[897] and he's inserting this magnet into this dude's fingertip.
[898] It's a very, very, very tiny magnet.
[899] Yeah.
[900] Well, how would you describe it?
[901] It looks like a BB that's been split into, like, sixths, right?
[902] That's how small it looks.
[903] It's about two millimeters wide.
[904] Oh, that's bigger than I thought.
[905] Maybe a little less than two millimeters.
[906] Because that looks really tiny.
[907] Maybe it's just a perspective of this video, but that looks like well under a millimeter.
[908] What's the charge for this procedure?
[909] It's free.
[910] That's crazy.
[911] 100% free.
[912] Fuck you up.
[913] Stick shit in your fingertips and laugh at you.
[914] It depends on who you get that stuff done from.
[915] But it'd be pretty reasonable.
[916] It'd be a bargain price anywhere between 75 and 150.
[917] Just like most subdermal body mods.
[918] What is, when you're talking about the future, when you guys sit around and you bring up what's on the horizon, what's like the big theoretical on the horizon?
[919] Does it have anything to do with, like, the kind of technology that we see in Google Glass being incorporated in, like, maybe a neural sort of a thing?
[920] So there was a paper put out a few months ago about connecting two minds of rodents.
[921] And so there was one rodent, if I'm not mistaken, in Virginia, and there was another rodent somewhere in Brazil.
[922] and they put this chip into the brains of these two mice and put them through an obstacle course.
[923] And with a roughly 60 to 70 % crossover rate, the other mouse actually learned the obstacle course of the other mouse it was connected to.
[924] Do you know if that's been done without chips, though?
[925] Rupert Sheldrake wrote a whole book about it.
[926] It's called the morphic field and morphic resonance.
[927] And he wrote a book about dogs knowing when their owners are coming home and people being able to tell when people look at the back of their heads.
[928] And it's a fascinating sort of a theory.
[929] But he showed that, I don't know if it's been verified 100%.
[930] But I know he wrote that he showed that if you teach a rat in one place, a maze, that rats somewhere else, another part of the world will learn that similar maze or that same maze quicker, statistically quicker.
[931] Right. They've also shown with chimpanzees, once they started inventing things to do with tools. They've demonstrated, like, an orangutan in Africa learned how to spearfish by watching people do it. There's a picture of this thing doing it. It's a motherfucker. It's the craziest thing ever. Well, other orangutans started doing it too. Once they start learning things, it's almost like it's in the ether. It's very strange. Yeah, but were these orangutans separated geographically? It's not like they could observe each other.
[932] Well, I don't know about orangutans, but the mice definitely separated geographically.
[933] That's what's odd about it.
[934] But you know what I think, man, I don't think it's that they're learning, they learn it and then it travels somehow through space.
[935] I think the evolution sort of grows through us.
[936] Do you know what I mean?
[937] Like it's something that's growing through humanity or these new innovations.
[938] So it's not as though one orangutan figured it out and sent out a signal.
[939] It's as though this was just a new phase.
[940] that was coming through this form of biomass.
[941] That's very possible, but the orangutans initially, I've confused these two stories pretty badly, but the mice learned in separate parts of the world. They weren't connected, whether it's a thousand miles or a hundred miles, whatever it was. But the orangutans were watching people.
[942] The orangutan started imitating people.
[943] So they did directly imitate human beings. But have you really stopped and thought about when human beings used to be essentially very similar to orangutans? It was probably only like a couple of million years ago.
[944] That seems like a long time to us, but in the span of the universe, that ain't shit.
[945] So the idea that a chimp could eventually, if they kept moving along and slowly innovating and changing and finding magic mushrooms and learning how to make fires with rocks, they could eventually become some kind of a human thing.
[946] I mean, we didn't used to be people.
[947] That's the fact.
[948] I mean, pretty much established.
[949] It's not like all of a sudden we popped up in this form, you know, figuring out, oh, we got to get away from leopards.
[950] But we became this, like slowly and steadily.
[951] It's just the time frame for us is so bizarre to wrap our heads around, with an 80-plus-year lifespan, to try to understand what a 10 million-year period of development is.
[952] We can't get our head into that.
[953] We don't know what that means.
[954] And 10 million years from that with this stuff, when it gets to that weird exponential technology thing that's going on right now, we're going to be unrecognizable 100 years from now.
[955] Sure.
[956] Well, that's what I feel like Lucas is getting at: creating an actual pathway between two minds, or between many minds, that we can control using principles we already understand.
[957] I don't know intuition the way I know the way an electron is going to travel.
[958] I know the way an electron is going to travel.
[959] I don't get intuition that way.
[960] So if we can create chips where I can have something in the back of my head, and I go like this, and Lucas is like, stop scratching your arm, because I feel it. Yeah. Like, that's an exponential... Touching your butt, how about that?
[961] You're sitting around your house, all of a sudden: No, Lucas, stop it! No. You're on a date, you're trying to keep it together.
[962] No You're on a date You're trying to keep it together You could get hacked with that kind of technology.
[963] Somebody could hack into you.
[964] That seems inevitable.
[965] Just like, someone hacked Twitter today, and I must have got a thousand, I'm not bullshitting, a thousand fake tweets about diet and exercise.
[966] Oh, wow, I look so good for summer because I started doing this.
[967] Like those bots, like a bot got through and just infected Twitter.
[968] I think there's a benefit to having a bunch of people develop their own technologies, because then you're not all affected together.
[969] Well, no, if you're developing your own technologies for your own implants, there's some sort of level of security. Like, some people use Macs because, hey, Macs don't get viruses the way PCs do, because people who want to write viruses are targeting PCs, because there's more PCs around.
[970] I think that is a blinking oasis in the desert.
[971] I don't think they're going to be able to stop anything from getting hacked.
[972] No. Absolutely not.
[973] Well, first of all, we've learned about the NSA and spying.
[974] It's already shown that you can't, you can't stop it.
[975] They've already got algorithms in place.
[976] They're already copying all your emails.
[977] Because of bottleneck technology.
[978] Because we can't invent our own internet.
[979] We can only use that internet.
[980] And that internet goes through certain filters.
[981] Well, it's also because these fucking corporations have given in to their overlords and given up the information.
[982] I mean, that's really where it comes down to.
[983] I don't want to live in a future where I've got artificial organs and all of the information from that is being pumped out to some, who the fuck knows.
[984] I don't feel comfortable with that. I want that information to belong to me, at least, so that I can do with it what I want. You know, I don't want any advertisements because my liver is doing whatever. Or I don't want people to be able to shut off my liver because I'm not being a good citizen. Can you imagine that?
[985] That's like an Ethan Hawke movie. There'd be no Snowden. Yeah, Snowden's brain would have melted out of his nose by now. They'd just turn him off. Man, I'm sorry, you guys, I gotta take a leak. Go ahead, go take the leak.
[986] That coconut water crushes me every time.
[987] Two weeks.
[988] Where do you think the end game is when it comes to all this biohacking stuff?
[989] What's the end game?
[990] Like where do you hope that it all winds up?
[991] Do you have like a man I hope by the time I'm X years old I can fly?
[992] I mean, is there something like that?
[993] I see an end game as something way larger than people screwing with their body.
[994] It's a whole bunch of different technologies that are happening all at once, and the fact that we don't have enough resources on this planet to support the population growth and the amount of people that we do have.
[995] And those sort of things add together. I feel like when you look at a company like SpaceX and what SpaceX is doing, and the fact that all these technologies are leading towards longer lives, the guy who's doing Soylent, the food-substitute guy.
[996] Methos dudes.
[997] Yeah.
[998] Those dudes in New York.
[999] Really awesome guys.
[1000] Yeah, isn't it?
[1001] I drank it.
[1002] I was like, what am I drinking?
[1003] I don't know.
[1004] And I think all that adds together.
[1005] And I think that, you know, when we have so many people and we're living a longer time and we have the ability to print anything we can conceive, then maybe it's time for us to start moving out to space.
[1006] Maybe it's time that we head over to somewhere where we'll be doing heavy construction work in zero G, so that other people can come there and live and be smarter and move forward from there.
[1007] Maybe it's time to start space colonization.
[1008] That's where we're going.
[1009] Are we going to be cyborgs?
[1010] That would be way easier.
[1011] For some.
[1012] You could, like, download your consciousness into a robot and send the robot into space.
[1013] I honestly think that's the, that's the smartest way to do it is to be a cyborg.
[1014] If you're going into space, that's what I've been saying.
[1015] Everything in space is trying to kill
[1016] any biological organism. You can't, you know, apes in a tin can won't cut it, I think. That's what I'm saying. And I think being able to withstand all of the crazy space weather, space conditions... Yeah, the conditions are insane. I mean, the temperature is instant death if you step outside, unless you have a pressurized suit. If you get hit by any sort of a solar flare or asteroid, you're done. We're too fragile even for this planet. That's why we've got, like, this stuff on us. And I think...
[1017] Motorcycle jackets and shit.
[1018] Yeah.
[1019] The idea of colonizing space as this, I don't think it's too dangerous.
[1020] Do you foresee a future where we really do download our consciousness into some sort of an artificial human being?
[1021] I don't know enough about the technology to say anything definite.
[1022] I think that it's a cool idea, but I've never seen anything compelling.
[1023] So I can't really say. It's a mind fuck, though, isn't it? The idea of, like, a little you, a little Lucas running around made out of rubber, and it's you. Your consciousness is in this thing. It's out there touching things and moving stuff around. Yeah. And also that you could duplicate a Lucas. Like, if you could download a Lucas, then your Lucas could end up on Pirate Bay. People could just be downloading you and then sending you pictures of their Lucas that they've put in some kind of Habitrail.
[1024] But that would definitely be another, that would be another Lucas, though.
[1025] Because once you start generating independent memories, I think you start having different emotions, different thoughts about things.
[1026] Yeah, that's going to be really weird.
[1027] Unless they all interface somehow or another together.
[1028] Which is, yeah, which I...
[1029] I mean, it is possible, right?
[1030] I mean, if you're dealing with...
[1031] Look, the idea of sending someone a video or talking to someone through FaceTime or Skype in real time off your phone, seems like the dumbest, the craziest thing ever.
[1032] Like, that couldn't be possible.
[1033] That wasn't even on Star Trek.
[1034] Oh, yeah.
[1035] They never even figured that out on Star Trek.
[1036] But it's real.
[1037] The idea that you and your three clones or whatever they are could be experiencing life simultaneously, and that you could multitask, and that the mind may evolve to concentrate on one person or another, these two sleep, or these two are out, you know, and you're doing two different things at the same time and aware, tuning in to either one, whatever you choose, sort of like touching your leg with your left hand while you're writing something with your right hand.
[1038] It's going to be a problem, man. You're going to have swarms of individuals.
[1039] There's going to be Rogan swarms.
[1040] Like, Genghis Khan.
[1041] Like, you know, Genghis Khan is responsible for something like 5% of the population's DNA, because, you know, he went on a party in spring in the 1200s and the whole world's never recovered.
[1042] You know, imagine if you're doing that artificially.
[1043] Well, when you talk about stuff that's that powerful, all of a sudden it becomes really important who has control of that technology in the initial, right? But how do you enforce that? It's a race. It's a race. Hold on a second. What about him? Perfect opportunity to talk about Itskov, because that's what he wants to create, right? He wants to create an army. He does. Live forever. Did you pay attention to the Global Future 2045 conference? Yeah, I know about the 2045 guys. What do you guys think about that? I like that they're working towards a solid project, because that's kind of rare in the transhumanist community.
[1044] There's just a lot of sci -fi talk.
[1045] Really?
[1046] And way too much.
[1047] Just way too much?
[1048] That's fine.
[1049] There's discontent in the transhumanist community, and debate on how to handle this correctly.
[1050] He's got to know what the fuck he's doing with his robots.
[1051] But I like the fact that they have a goal that they're working towards.
[1052] I just fear that there will be a particular group of people that'll get it and then just fuck us for eternity.
[1053] That's your big fear.
[1054] Your big fear is someone else getting a hold of the goods first, not giving you the super chip.
[1055] I fear about the type of people that'll get it first because the world didn't fare well when a particular set of people got the gun first.
[1056] That wasn't good for a lot of the human species.
[1057] Well, it was good for us, though.
[1058] That's how you're here, dude.
[1059] That's how you're sitting on the internet with a plastic microphone in front of you, sucked out of the earth, converted from oil.
[1060] I would rather not have to move forward with that sort of contention and fighting and just, I don't think.
[1061] Is that inevitable?
[1062] I mean, with enlightenment, I mean, we have, whether or not we've achieved enlightenment, we certainly haven't.
[1063] But at the highest levels of humanity, I think people are probably more aware and tuned in today than at any other time in history.
[1064] And it's probably way safer today than any other time in history, even though there's a lot of crazy shit that goes down on a regular basis. If you compared your everyday life to that of someone living in Siberia in the 1200s, when the Mongols came storming in, we live a lot better. It's pretty goddamn good. And I think that a big part of that is technology and the access to information. The access to information is freer than ever before, technology more powerful than ever before, hence people are safer and people are more moral. I really do believe that.
[1065] And I think the idea of people being moral is more accepted as not just something that has to be enforced in order to keep the peace, but that's something that's beneficial and something that you should strive for.
[1066] It's admirable.
[1067] I think that that really might not have existed on such a mass scale in the past, even in the past 20 or 30 years.
[1068] I think we're dealing with like pretty unprecedented times.
[1069] And it sounds like really super, you know, optimistic.
[1070] But I feel like one of the byproducts of this technology that we're all experiencing is that we're experiencing a sort of mini enlightenment and a burst of enlightenment because of this information.
[1071] If that is the case, I think that this kind of technology sort of dissolves more boundaries.
[1072] I don't think it's going to lead to a constriction of the people that utilize it.
[1073] I think it's going to lead to a freedom of the species itself.
[1074] I really do believe that.
[1075] I think that if I look at the trend, even though everybody wants to think the sky is falling, the trend doesn't indicate that.
[1076] The sky is falling for some people in some spots, and the apocalypse is right now, if you live in the Congo.
[1077] If you're in Liberia, the apocalypse is today.
[1078] That's Mad Max, right?
[1079] That's going on right now.
[1080] But the trend for us here, it's obviously not in that direction.
[1081] I mean, it seems like there's always going to be a worst-case scenario on this planet, but that worst-case scenario is far better today than it's ever
[1082] been in history, and the best-case scenario was completely non-existent even in science fiction novels. It's going in that way. Can't stop it. Yeah, wait until one of these people invents a swarm of nanobots that goes flying out of their basement devouring toddlers. I mean, the thing is, like, I like "toddlers," it's a fun word to say. That's your thing, you always go with the dead baby. But no, I think that when you're talking about this kind of stuff, you're talking about acceleration.
[1083] I think you're talking about a technological acceleration.
[1084] And wherever there's acceleration, there's an increased chance of hurting yourself.
[1085] If you're on a skateboard going a few miles per hour, you're fine.
[1086] If you're on a skateboard going downhill, you can wipe out.
[1087] Well, humanity is on this technological skateboard that's exponentially accelerating.
[1088] And in that way, the role of the individual becomes more and more and more intense, which means that with this kind of stuff, when the individual has access to the kind of technology that they have today, they can do a ton of damage.
[1089] Look at what happened with those freaks.
[1090] All they needed was a pressure cooker and some ball bearings.
[1091] And they permanently, they killed so many people and traumatized so many others.
[1092] In the same way, five years from now, what happens when people have access to this kind of stuff?
[1093] It is, we are setting ourselves up.
[1094] But you're talking about very unique individual circumstances, in
[1095] comparison to the 7 billion people experiencing things completely differently all around the world.
[1096] The problem is that you're dealing with 7 billion people that have access to the internet, or at least a good percentage of them, and have their stories being told.
[1097] So you're hearing about instances all over the world simultaneously all at once.
[1098] We are not designed for that.
[1099] We're designed to take information from our local community.
[1100] We're designed to take information from, oh, there's a band of dudes.
[1101] They're about a mile away on foot, and they're coming here to fuck our women.
[1102] that's normal.
[1103] That's what we're supposed to deal with.
[1104] You're not supposed to deal with a story about a guy in Switzerland who fucked his pig to death.
[1105] You know, you're not supposed to deal with this guy in Detroit that's got a fake Bigfoot in his cooler out back.
[1106] These are all, you know... You're dealing with so many human beings, an impossible number of seven billion, and you're getting information from a vast number of those people.
[1107] So you're always going to get these freaks.
[1108] You're always going to get that. But still, the trend, the overall trend, is way better than it's ever been.
[1109] And I agree with you there.
[1110] My hope would be that we become more ethical before we get the power to cause damage.
[1111] So even though we're more ethical now, any ethical mishap at this point would be a lot more disastrous.
[1112] It's not so bad when the most powerful weapon you have is a sword.
[1113] Yeah.
[1114] Well, that's not true, because Genghis Khan killed 70 million
[1115] people with a horse and a sword, and a lot of other dudes with them. But they did it with horses, right? 70 million people over his lifetime, with horses. But if the scenario is you have 3D printers, and let's imagine 3D printers in 200 years, where now you can, like, molecularly assemble stuff, and somewhere along those 200 years someone discovered antimatter, right? So let's imagine that there's a way someone could create antimatter that, when it met matter, would dissolve the universe.
[1116] Let's just say there's an imaginary technology.
[1117] They could open up...
[1118] And you know what's going to happen?
[1119] A few people are going to die, and then it's going to force everyone to learn how to be really nice to everyone.
[1120] Because everyone can create a fucking anti -matter bomb in their computer.
[1121] That's what I hope.
[1122] That's right.
[1123] Yeah, that's what I hope.
[1124] That's what they mean when they say a well-armed population is a polite population.
[1125] I mean, we're going to take it to the complete next level.
[1126] An anti-mattered society is a polite society.
[1127] Black holes all throughout our planet that were created by assholes.
[1128] We're going to have to hop over them on our way to 7-Eleven.
[1129] There's a black hole over there.
[1130] Oh, thank you.
[1131] Something in transhumanism that people don't talk enough about, I think, is making people more ethical.
[1132] I think there's only talk of making people more capable and more intelligent.
[1133] But there's no talk of making people nicer for the sake of just living in a more sane world.
[1134] Because we're nice now, but what happens if infrastructure goes away?
[1135] And we become hungry again, and we've got lots of weapons around.
[1136] What happens?
[1137] Yeah.
[1138] And it would take a longer period to build back up in this state, rather than if we were nicer and had forethought, instead of having hunger automatically take over and be like, well, fuck that guy, he has food. You know, I think that we should start having a discussion about making people more ethical, and being able to empathize with even other species, because who knows what we will run into in the future, and that might be important.
[1139] Well, that's interesting you brought that up, because that was something that Dmitri Itskov actually touched on in the conversation that I had with him, where he was talking about that being an important part of this whole movement, and that the movement wasn't just about achieving some new technological state.
[1140] It's about elevating humanity as a whole overall.
[1141] And I thought that was really interesting, that he is taking that into consideration.
[1142] That's one of the reasons why, in this Global Future 2045, that's what it was, right?
[1143] The Global Future 2045 conference that he had, he brought in a lot of religious and spiritual leaders to sort of ask some questions of these different faiths, to try to get an understanding of what their philosophies were and how they would incorporate this sort of new, impending technology, this transhumanism idea, were it to come to fruition.
[1144] It was really pretty fascinating stuff.
[1145] I mean, he didn't just take it from the technological standpoint.
[1146] Itskov really dove, he dove into the psychological and the spiritual aspects of it as well, which I thought was unique and admirable, and probably pretty important if this thing moves further.
[1147] You're going to have some real ethical questions.
[1148] You develop immortal cyber beings with skin made out of spider web silk that's bulletproof.
[1149] I mean, there's already the transhuman Mormons.
[1150] They've already figured out how to do that.
[1151] Right.
[1152] Yeah, the Transhumanist Mormon Association.
[1153] What?
[1154] What did you say?
[1155] Yeah, that's a real thing.
[1156] It's like saying gays for Jesus.
[1157] Yeah, there's...
[1158] That's hilarious.
[1159] Transhumanist Mormonism.
[1160] Yeah, there's a subset of the Mormon community.
[1161] The problem is if you're willing to be a Mormon, you're fucking willing to be anything.
[1162] It's just a matter of who rings your doorbell first, you know?
[1163] It's like just sit them down and talk them into anything.
[1164] You're a Mormon.
[1165] You believe a 14 -year -old named Joseph Smith in 1820 found golden tablets that contained the lost work of Jesus.
[1166] And only he could read them because he had a magic rock.
[1167] No wonder they're afraid of gay marriage for your family. Because if someone can talk you into being a Mormon, they can talk you into blowing them too. That's the joke. Get it? It's right out of my stand-up. Goddamn, you got me doing my own material. You're a part of a movement right now, whether you call it grinders or transhumanism or whatever. You're part of a movement where you're focusing on a very specific thing: you're focusing on incorporating technology into the body to improve the body. And as long as you're focusing on that, as long as there's a whole community focusing on that, whether or not now it's just magnets in the fingers and everybody's poo-pooing it and saying, what's the big deal, there's going to come a time where it's a lot more than that. Whether it's some new invention of something that has absolutely no rejection in the human body, so you can add all kinds of things, or some new technology comes along that radically enhances perception, whether it's visual or hearing or thinking, cognitive function, the ability to read each other's minds as long as you both have the chip. It's all going to happen, right?
[1168] Like, you can't text message someone who doesn't have a phone, okay?
[1169] But if they have a phone, you can text them pretty easy.
[1170] And once we all have something like that in our heads, it's like, your phone will work with your friends' phones.
[1171] OMG, I'm looking at you right now, and I'm writing this down.
[1172] There's going to be those moments where we hit some next-level thing that we didn't see coming.
[1173] Just like we didn't see the Internet coming, we didn't see cell phones coming.
[1174] No one saw that shit coming.
[1175] Again, even Star Trek, they had a fucking walkie-talkie, man. Kirk out.
[1176] You have to, like, say that and shut it.
[1177] He had to flip it open.
[1178] It's so stupid.
[1179] They didn't see iPads.
[1180] They didn't see any of that shit.
[1181] And you guys are right on the crest of the wave, right on the crest of the technological tsunami, getting sucked into the singularity.
[1182] As we're all getting torn towards that waterfall, you guys are in the canoes at the very front.
[1183] How far would you be willing to take it?
[1184] If someone came up with some real Steve Austin, Six Million Dollar Man arms and legs, they just had to saw your shit off and put on some new ones, would you be down? If you knew, if it was like fake tits, where everybody was doing it, if it got to that point? I think if everyone was doing it, we'd all do it. Yeah. It's like if someone came up to a woman in the 1800s, came up to a pilgrim or something, and said, listen, I like what you look like, but I'm thinking maybe some bags of water surgically implanted under your breasts. You know, you'd get a lot more attention around the town.
[1185] They didn't even have bags.
[1186] They didn't have plastic.
[1187] But they would think you're fucking crazy.
[1188] Are we going to think, I mean, is that us looking at the future about, like, it's going to be standard?
[1189] Like, you still have your arms and legs?
[1190] Oh my God, girl.
[1191] What are you doing?
[1192] What is that?
[1193] Yeah, that'll be the end of rape.
[1194] Everybody's just a fucking super robot.
[1195] It'll be the new retro thing, though, to have arms.
[1196] Everybody could fight off everybody.
[1197] Yeah, I think that it becomes an expression of yourself.
[1198] Kind of the way that body modification is already.
[1199] You know, why would I replace perfectly good arms and legs when I could have extra or maybe something, something extra.
[1200] Now you're greedy.
[1201] See, greedy, motherfucker.
[1202] Trying to give me a super body? Like, how about I have three dicks?
[1203] But by the way, man, you got it.
[1204] Four arms.
[1205] Four arms would be great.
[1206] Shiva.
[1207] Maybe that's what Shiva's all about.
[1208] Maybe that is the future.
[1209] Shiva is the robot.
[1210] Holy shit, I never even thought of that.
[1211] Maybe that's what that was all about.
[1212] Some crazy transhumanists from the past.
[1213] Maybe they figured it out a long time ago.
[1214] And they just barely wrote it down. They were so high, they wrote down, like, a few passages in the Bhagavad Gita and then went back to eating mushrooms. Hold on, is that what it says? But I do like that you ask somebody who has magnets under his hands if he would replace his arms with robot arms. The answer is yes, of course. You would do it in a second, even if no one else was doing it. But what if you couldn't enjoy, like, foot massages anymore? Would you replace your legs if they were just numb? They were numb but awesome, like jumping over buildings, running 60 miles an hour, but they were numb. You know, you don't feel a good massage.
[1215] Are you saying that if you had to pick between a joint foot massage?
[1216] Someone grabs your butt, you feel nothing.
[1217] I don't, I don't think it has to be a trade-off.
[1218] It might have to be, but if you keep your cock and balls, but the bottom of your sack is numb.
[1219] The only way you can feel your balls is if someone squeezes them. But the legs on down, fake as fuck, super powerful pistons, and, you know, nuclear-powered, jumping over trees.
[1220] So, wait, you're saying I could jump over trees, but my balls are numb.
[1221] Only the bottom.
[1222] The bottom of your balls are numb.
[1223] I'd take numb balls.
[1224] Yeah.
[1225] I'll take all the way to the top.
[1226] Do you think you would be willing to go through an operation that removes your legs?
[1227] I mean, look, a long time ago, with operations for just something like a fake tit, you'd probably die.
[1228] If someone tried to do that to you, you would die just from the fact that they didn't know how to sterilize things.
[1229] They didn't have any way of putting you under.
[1230] Sure.
[1231] It would be nothing today, but back then it was a serious, serious thing. Today, girls go back to work in a couple of days, right? The idea... I don't know why I keep going back to fake tits. I'm trying to find a non-sexist way of approaching this, as far as male enhancement, but I can't find anything that's as standard as the craziness of female breast enhancement. It's one of the weirdest things human beings do. But if we get to some point where amputating legs is, like, nothing... Yes. And replacing it with an artificial one, super powerful. I think that the difference between a breast implant and a fake leg, right, is that already my knees are giving out.
[1232] Already, my legs are going to wear out.
[1233] Stop right there.
[1234] Have you ever seen a chick that's had three tits, three kids, rather?
[1235] Have you ever seen a chick's tits who's had three kids?
[1236] Everything wears out, man. We're made out of flesh.
[1237] It would have been way better if I said it smoother.
[1238] Yeah.
[1239] It was a missed opportunity.
[1240] Say it again.
[1241] I stumbled.
[1242] No, sweet.
[1243] We do wear out.
[1244] You're totally right.
[1245] There's nothing different between those two, but we wear out.
[1246] Yeah.
[1247] Eventually, your choice might be shitty legs or robot legs that were better.
[1248] Or maybe they fix the undersack and, you know, come up with some new technology so that even though the undersack is artificial, it has sensors and it sends the signal.
[1249] I would just like everybody here to stop pretending that if they had the option, they wouldn't get their legs amputated and replaced by super powerful robot legs.
[1250] But I wouldn't be, what about the fact that they're numb?
[1251] That's a weird thing, man. I would want someone else to do it for a few decades first.
[1252] You're jumping over trees.
[1253] People are already working on prosthetics that interface, yeah, that feel.
[1254] People already, and we're working towards more sensitive prosthetics so that you can feel more.
[1255] And pretty soon, I'm hoping, you can even dial it so that if you're going to do something dangerous or painful, you can dial the pain down, whatever; it becomes optional.
[1256] At that point, why would you keep these?
[1257] These aren't modular.
[1258] You can't switch them, they don't rotate.
[1259] Yeah.
[1260] Why?
[1261] Why would you, why?
[1262] Because they feel good when you hold someone's hands.
[1263] But you can, you can always put over a layer of soft stuff.
[1264] Why not just pretend you're alive and jump off a bridge?
[1265] I mean, it's at what point in time do we end this?
[1266] Okay?
[1267] Your whole body is going to be this fake robot body, and you're going to take what is the memories of your life and your consciousness and download them into some computer chip, and the real you rots, and this dream you goes wandering around until your batteries run out.
[1268] Yes, thinking you're really living on a farm with the Waltons. Meanwhile, you're a stupid thing with a battery. You don't even know what the fuck's going on with your life. Meanwhile, you're, like, holding hands on the beach and getting foot massages while all of us are bounding through the universe, jumping over buildings. Well, I'm just playing devil's advocate, but there always are those people that want to hold everything back and go, what, like those assholes that we were listening to: science has yet to explain how a seed becomes a beautiful flower.
[1269] Yes, actually, it has, dummy.
[1270] You know, some of these religious people we were listening to, talking at that conference, were just like... their ideas were so stupid and antiquated, clinging on to mysteries that have already been solved.
[1271] Right.
[1272] But as with what we were talking about before, those mysteries that have been solved, when we do understand the mechanism behind a seed becoming a flower, become even more fascinating.
[1273] Because even though we understand the process now, it's still, wow, that's cool. Before, it used to be, wow, that flower is beautiful, what a mystery, I am so glad God brought this to my life. Now we're like, isn't this incredible? This seed, it got oxygen, and then minerals from the soil, and then it converts it into energy, and then there's photosynthesis, and it grows, and then the flower blooms. This is insane. The whole process behind it is so mind-boggling and enriching. And as a human being, that's one of the things that really jazzes us up, when we learn new shit, when we discover new shit. It's part of, like, when you tell someone something and they go, whoa, that's cool. Like we were talking earlier today about this discovery. These scientists in Germany and America have discovered this new type of animal that used to be, like, an ancestor of the human being, from 41,000 years ago. They thought it might be a Neanderthal. They thought it might be a Homo sapien. And it turned out to be some new thing that they didn't even know existed. They're calling it a Denisovan, based on where they found it. And when we were talking about it before, we were all like, whoa, cool, wow. There's a part of learning something new and discovering something new that just jazzes us up. Yeah. And that's one of the things that brought me to transhumanism, the desire to see the Milky Way. I mean, learning that we couldn't live long enough to explore the galaxy, to me, as a kid, was just...
[1274] Devastating.
[1275] Yeah, it was devastating.
[1276] That's when you know you're a super geek.
[1277] Because, like, you know, I grew up watching Star Wars and, you know, watching Doctor Who.
[1278] And then finding out that we're too fragile and we don't live long enough to see anything.
[1279] It's like, we can't...
[1280] What we do is we say, what's over that hill?
[1281] What's over that mountain?
[1282] Let's go.
[1283] I know it's dangerous, but, hey, there's nothing else to do.
[1284] Let's do it.
[1285] And the fact that we can't cross space because...
[1286] There's just too much radiation, or you'll develop wrong, or whatever. To me, that was devastating and unacceptable. And I think, what other option is there but to move forward? That's so funny: devastating, unacceptable. Did you hear about the... there's a gentleman at the University of Connecticut that's the foremost scientist working on time travel. Roy Mallet, I believe. Is it Ray Mallet? I forget his name. But he's a professor, and he started his work on time travel because his father died when he was a young man, and he wanted to go back in time to save his father. So this guy, it's like a goddamn character in a comic book. He's like a character in a Spider-Man book. And he's been working on time travel to go back in time and save his father. Cool. It's those things where you're like, I can't imagine that I'm living in a world where I won't get to fly in a Battlestar Galactica ship and fight the Cylons. This is ridiculous. I think that's what's beautiful about transhumanism: the impulse within it is the same impulse that got people to go onto ships to sail towards a continent that they'd kind of heard about, risking everything, with no food, probably going to die, maybe the earth just ends.
[1287] I think there's something beautiful about what you guys are doing, even though I do see it as a kind of self-destructive act.
[1288] I think when you put magnets under your fingers, you aren't thinking ahead.
[1289] You aren't thinking about what you're going to be like in 30 years. You know, you don't know. Says the dude who used to have a nose ring for, like, three days. And didn't the nose ring get infected and almost kill you, you self-destructive bastard? The impulse that got me the nose ring was not the same impulse that drove explorers through the sea. It was just being dumb and high in Venice Beach. There's no glory to it. What these guys are doing is glorious. It's really cool. It's just a kind of low... it's a bit... you have to start somewhere, you know. You got to start somewhere.
[1290] You're starting with magnets in your fingers.
[1291] By the way, if anybody's listening to this, the scientist was Ronald Mallett, Ronald L. Mallett, M-A-L-L-E-T-T. He's a PhD at the University of Connecticut.
[1292] Would you like to meet Dmitri Itskov, the man who founded the Global Future 2045 conference?
[1293] And if so, what would you ask him?
[1294] You first, please.
[1295] What would I ask him?
[1296] I would ask him.
[1297] Or what would you say to him?
[1298] I mean, it doesn't even have to be a question.
[1299] Would you have anything specific that you would want to talk to him about?
[1300] I would actually ask him what his vision of the future is.
[1301] If it all pans out perfectly in his view, what would it look like?
[1302] Because I'm very interested in what his utopia is.
[1303] I'd actually love to meet him.
[1304] What about you?
[1305] Would you ask him?
[1306] I'd like to meet him.
[1307] I'd like to know what he thinks about where we're going as a whole, as a group of people who are experimenting with ourselves and doing this stuff.
[1308] And I would ask him for the resources that we don't have.
[1309] When you talk about how, when you put a magnet inside yourself, you can't think about the long term, well, it's not that we haven't.
[1310] It's that the people we talk to who are professionals
[1311] in these fields, they can't risk telling us the answers, because they have professional certification boards that would say, hey, you're behaving unethically.
[1312] You're telling someone to do something harmful to themselves, and they don't know the risks they're taking.
[1313] We're yanking your license.
[1314] And so we don't get answers to questions that we need.
[1315] And I think a guy like Itskov could probably put us in touch with a lot of people who have a lot of answers for things that could really help us out. Awesome, dudes. Thank you very much. Really fun conversation, really interesting stuff, fantastic subject. And thanks for doing what you're doing. Thanks for sharing your information as well. And thanks for having us. You guys have a great attitude about all this technology too. And even though I played devil's advocate about it, I really think that ultimately that would be the best way, you know, if we all had access. But the problem is, you know, people are weak bitches, I'll tell you. Grindhouse Wetware, ladies and gentlemen. Good night.