The Joe Rogan Experience XX
[0] Five, four, three, two, one, and we're live. James, what's up? How are you, man? I'm great. Don't freak out about the sound of your voice in the headphones. Is this the first time you've ever worn headphones, or been on a podcast? Definitely the first time I've heard myself talk. Is it weird? After a second, yeah, it's weird. You get over it. Pretty self-conscious about it, really? You gotta be. All right, you can take them off if you want, if it's freaking you out too much. You think you're gonna get through this? Let's just take this fucking stuff off, man. We don't need these things.
[1] Just keep this sucker close to you.
[2] You'll be fine.
[3] So, first of all, thanks for doing this.
[4] You've been on this crazy sort of whirlwind tour.
[5] Have you gone anywhere, or you just been doing it mostly from your house?
[6] Mostly from my house, just on Skype.
[7] Now, for people who don't know the story, let's give them the short version of it.
[8] You were working at Google, and what prompted you to write this memo?
[9] Yeah, so they would have these company -wide meetings where they just push a lot of this diversity stuff, and some of it was kind of weird, so I decided to go to these secret meetings sort of that were about 100 people, completely unrecorded, and they would talk about some of the things that they're doing, and it would really contradict what they're saying publicly, where, oh, no, we're not, you know, changing any of our hiring practices for these candidates, and they said, yeah, we basically are, uh, making it easier for some candidates to get in.
[10] And, you know, I voiced some concerns, but people just shamed me and were like, no, you're wrong.
[11] You just, like, have white male privilege.
[12] They said you have white male privilege?
[13] That was the actual word they used?
[14] Yeah, there's a lot of that going on.
[15] And so they asked for feedback on the program.
[16] So I wrote this document to clarify my thoughts.
[17] I sent it to them.
[18] They looked at it, but, you know, they just ignored it, never told me anything.
[19] So I went to a couple more of these programs, and, you know, I gave similar feedback.
[20] I gave the same document.
[21] They kept looking at it, but just never said anything.
[22] And, you know, I would send it to random people that I knew, and half the people would be like, yes, exactly, this is what I've been thinking.
[23] And the other half would maybe disagree with some points, but it would never be, you know, emotional outbursts or anything.
[24] It would just be like, oh, are you sure that this is actually happening?
[25] It's like, yes, because, you know, I've actually been to these unrecorded meetings.
[27] This is what's happening.
[28] So if you could get into specifics, like when you're in these meetings and they're talking about diversity, what is their concern and is it they're trying to promote an image of diversity?
[29] Are they trying to promote actual diversity?
[30] Do they think that there's a benefit for diversity or is it a part of their public image?
[31] And is it a lot of it to avoid criticism?
[32] Because I think there's a big issue with, I mean, if you don't have all your bases covered, from black women to Asian men.
[33] If you don't have all your bases covered, you can get like pretty roundly criticized as not being diverse or being possibly racist.
[34] And when you do that, you're kind of fucked.
[35] Yeah.
[36] So Google definitely has a huge target on its back.
[37] And so there are people that want to complain that, oh, Google is not diverse, therefore it's racist and sexist.
[38] And so that's a lot of their fear.
[39] They look at their representation and then compare it to the overall U.S. population and say, oh, we only have 20% women, we should have 50%, there's obviously some sexism happening. And so a lot of their stuff is, oh, we need to fix this because, you know, all this sexism is bad. And obviously sexism is, of course, bad, and I obviously don't, you know, want there to be any sexism, but I just don't think that that's the sole cause of this disparity in representation. Yeah, it seems like in the interest of promoting an image of diversity, they're willing to bypass science and the truth and the reality of culture, the reality of human biology and evolutionary psychology.
[40] There's just so much that they're willing to look past to get to this one thing, which seems to be like this really important thing in today's society, that you want to promote an image of diversity.
[41] Yeah, it's more important than anything. So, like, when you're in these classes, or these, I mean, I wouldn't call it a class, what would you call it, a meeting, whatever they are? Yeah, some of them were classes, some were, you know, day-long programs. And so they would teach you things. Like what would they teach you? They would talk about unconscious bias. Oh no, like, you might be racist, you have to find the racism in you? Yeah, there's a whole program that's trying to retrain your brain to think about race in a new way or something. So they're just assuming you're guilty, pretty much, because you're white? Well, yeah, I mean, they look at the representation and say racism, sexism. Do black people have to go to this? I mean, no one has to, no one has to, but they are definitely pushing it on people. And now managers are being evaluated by how well they promote diversity and inclusion, and, you know, it's just a slippery slope, and I think it'll eventually become part of our performance review. So if you're a white woman, do you have to go to this? I mean, are you encouraged to go to this, or are you like, hey, you made it through, this is what we've been looking for, you're fine? Or if you're an Indian woman, even better, right? Is that how it works, or would you still have to go there and approach your unconscious biases? Yeah, they say everyone has these unconscious biases.
[42] Even towards white people?
[43] So do they have those where they have like black people with their unconscious biases towards white people?
[44] So they never acknowledged that anyone could be racist against white people.
[45] Of course.
[46] Why would you?
[47] Yeah, it's all this, like, you can only be racist if you have power, or sexist if you have power.
[48] They believe that?
[49] The racist part?
[50] Yeah, I think so.
[51] Well, that's insane.
[52] That's a redefinition.
[53] It's a very recent redefinition of the term racism, but it's very slippery and very dangerous because you could see it as promoting, in fact, like, exonerating racism towards other ethnicities or towards white people or towards people that you feel like are in a privileged class, you can get away with it because it's no big deal, because they're the ones who are racist.
[54] Even if it's not even that person, if it's people who look like them that have lived for centuries, like somehow or another, you're a guilty person, in their terms, with your white privilege.
[55] So, like, what would they tell you when you would go to these? Did you, like, express some discontent? Yeah, my main concern was them saying it's 50% in the population, look, Google only has 20%, and so we're obviously... Yeah. And, you know, there were clear reasons, at least in my mind, that it's not as simple as they're making it out to be, and that, you know, there are some differences, and that could explain some of the issues that women are facing.
[56] And so a lot of these women issues in tech, I feel, are actually not really gender issues.
[57] They're just, you know, women on average are more cooperative, for example.
[58] And so they may find it harder to, you know, lean in in the corporate world like Sheryl Sandberg is saying.
[59] But, you know, there are men that also feel like that.
[60] I'm not very assertive.
[61] I'm actually pretty shy.
[62] And so I feel the same stuff.
[63] It's not that, you know, there's a ton of sexism.
[64] It's maybe that male typical behavior is rewarded just as, you know, competitiveness is rewarded in a lot of corporate world.
[65] But it's not that we're just, oh, you're a woman, therefore you're obviously bad at coding.
[66] You know, no one is ever saying that.
[67] Right.
[68] I think there is absolutely an issue with assertive women being treated very differently.
[69] than assertive men. Like, an assertive woman is a bitch, like, you don't want to be around them. That's, like, the bias, and that's a real issue, I think, for women that want to enter into any sort of a competitive field. And, you know, where a man would be assertive, if a woman does the exact same thing, she's looked down upon. Yeah, she's looked upon like a problem woman, or like someone you don't want to work with, whereas the guy is just ambitious. Yeah, although some people will twist that, because, I mean, a lot of it is just they try to fit their ideology, and they see one data point and they extrapolate. So they see these studies, and it's true that these women are viewed as less likable, but they are seen as just as competent, and so their performance review isn't really affected by being assertive. It's just that socially they may not be liked as much. Right, but that's got to be a factor in the way they behave, because, yeah, for men, a ball-busting successful man is supposed to be, like, looked up to, like, oh, this is the guy who's kicking ass in the corporate world.
[70] He's doing it right.
[71] Like, you know, Bob is ruthless.
[72] But if Jenny's ruthless, like, you don't want to be around her.
[73] You know, it's a weird, it's just, that's, I feel like if there is a real bias with men, obviously I don't work in tech, but I would assume that that would be a real bias.
[74] Yeah.
[75] And I mean, I think some of the solution to that is just allowing people to be more cooperative.
[76] And, you know, actually, so for example, at Google, you're really rewarded for owning a particular project and seeing that one project go through.
[77] But if you're someone that, you know, can really help a lot of different people, and you're not necessarily the sole owner of any individual thing, but you provide a lot of value to the company, that isn't seen as being as positive as someone that really drove a project alone.
[78] That's interesting.
[79] Yeah.
[80] That seems like a bad thing for teamwork.
[81] Right.
[82] Is that just a bad philosophy, or something that got stuck in the way the system works? I think it's sort of just, it's hard to evaluate if I spent, you know, 10% of my time on 10 different projects and I helped. That makes sense. Yeah, so you'd have to essentially trust the workers' instincts and work ethic. Yeah. Now, the blowback from this has been very intriguing. You know, as an outsider looking at it, when I first heard about it, you know, I thought, well, this mean, angry man must have written some things saying that women suck at tech, or they suck at this, and, you know, people are reacting to this blatant misogynistic screed that I was hearing about.
[83] When I read it, I was so confused because I was like, where's the mean stuff?
[84] Like, where is this?
[85] And, you know, the other thing that was really confusing was that some people were reprinting it without citations.
[86] Right.
[87] Did that freak you out?
[88] Like, when you're being misrepresented?
[89] Yeah, especially when people would say, oh, it was so unscientific because it didn't have citations.
[90] Right.
[91] And that was their entire argument.
[92] Who did it?
[93] Who printed it without citations?
[94] Because some major publications republished it.
[95] Yeah.
[96] I think it was Gizmodo or Motherboard or something.
[97] Why the fuck would they do that without citations?
[98] It seems so unreasonable and so irresponsible.
[99] I think a lot of these companies just have a certain narrative that they're trying to push.
[100] Yeah.
[101] And so even, you know, I've tried to talk to a lot of these reporters, and I'll give hour-long interviews with some of them.
[102] And at the end, they'll just write the same sort of article of like, oh, yeah, he's just a misogynist.
[103] Yeah.
[104] And so I think even if I can convince the individual journalist, they are under pressure by their boss to write a certain type of article.
[105] God, what a weird world we're in right now when it comes to that.
[106] Because I was looking for something that could be, could be, like, evidence of misogyny.
[107] The only thing that I could find, and this is a very mild criticism, is that you were saying, I believe you used the term neurotic, that women were more likely to be neurotic.
[108] Neuroticism.
[109] Yeah, yeah, yeah.
[110] That's one where a lot of women go, well, fuck this guy.
[111] But that's it.
[112] That's all, I mean, but what did you base that on?
[113] Yeah, so there's the psychological Big Five personality traits, and neuroticism is just one of them.
[114] Right.
[115] So that's the actual term that they use, and it's sort of unfortunate that that's a term.
[116] Yeah.
[117] Yeah, it's, that one, I feel like maybe you could have danced around that a little bit.
[118] Yeah.
[119] But that's it.
[120] I think it's just, I was too much into the, like, I've seen the word so often that I didn't really associate it with neurotic and the negative connotations.
[122] Yeah.
[123] Well, I've seen a bunch of your conversations.
[124] I've listened to you, talk to Ben Shapiro and a couple of the folks.
[125] And, you know, your thought process is very reasonable and very well sorted out.
[126] And another thing that I'm not hearing from anybody is how you wrote a whole page and a half describing all the different ways that women could be more involved in tech, or you could encourage more women into tech. Like, this is not the work of a misogynist. This is the work of someone who's carefully considering an issue and looking at it from a very, what I felt like, and correct me if I'm wrong, but that you felt frustrated that you were looking at something where the way people were approaching this, they weren't looking at it for what it was. They had kind of decided how they were going to describe it and how they were going to deal with it.
[127] And it wasn't really based on facts or reality and certainly not on science.
[128] And you sort of felt frustrated by this and you decided to try to interject with as much of the current science as you could that could possibly explain choices.
[129] Not why women are bad at it, not why they shouldn't be in it, which is what I kept reading.
[130] But more why women choose to go into certain professions, what could be the impediment, and what we could do to maybe encourage more women to do it, instead of doing this sort of blanket-style diversity where you're just like, oh, we need two of these and we need two of those, which is what I seem to think they were doing.
[131] Is that a good assessment?
[132] Yeah.
[133] By the way, this will never trend on YouTube.
[134] We might get five million hits.
[135] I mean, that's a real problem too.
[136] Right.
[137] There's a lot of censorship when it comes to these sort of conversations.
[138] Like, they would rather look at me, who looks like a meathead, and look at you and go, oh, well, these fucking guys are just talking shit about women for an hour.
[139] You know, I mean, right?
[140] I mean, do you feel that?
[141] Definitely.
[142] And I mean, you'll be labeled alt-right now.
[143] I've already been labeled alt-right.
[144] It doesn't matter how many left -wing positions I support.
[145] I look alt-right.
[146] Yeah.
[147] Which is obviously, like, sexist.
[148] Yeah, for sure, misogynist, racist, all that stuff.
[149] Yeah.
[150] Well, but I mean, just labeling us because we're white men or something, a certain label, because that it's...
[151] Yeah, it's prejudice.
[152] I mean, it really is.
[153] But people don't mind prejudices in that regard.
[154] They have an issue with prejudices when it comes to what they feel like are disenfranchised or, you know, marginalized people.
[155] But white people, fuck them.
[156] You know, that's the thought process, right?
[157] You can't be racist towards white people.
[158] So, like, what are the most egregious things?
[159] Like one of the most ridiculous things they were trying to push when you were at these classes or meetings.
[160] So, I mean, besides the fact of just certain things in our hiring process that would favor certain people, which would create negative stereotypes for people just in general.
[161] So, like, one thing about stereotypes that they don't realize is that, you know, people will automatically create stereotypes no matter what.
[162] And it's based on their environment.
[163] So, and we see this with affirmative action too, in academia, where if you create a sort of situation where portions of the population are performing differently, then you'll automatically create the stereotype that, oh, maybe all the Asians are smart and all of the other minorities aren't as smart in this college, right?
[165] Because you need a 1600 to get in if you're Asian, and you need lower otherwise.
[166] And so you'll automatically create that stereotype, and that's negative for everyone, because, you know, it creates this tension between the groups and they self-segregate because of that, while if you just put everyone at the same level, then they'll just intermingle and it'll be great. And so, you know, that has its negative consequences, and it may be illegal, which is what I was trying to say in my document. So that aspect I think is bad, but then also, you know, once you think that, oh, all of this is because of sexism, and even though we can't really see overt signs of sexism, like, oh, yeah, you're a woman, therefore you're bad, and no one is saying these sexist slurs or anything, then it must be some low-level bias that we all have.
[167] And that's why they're pushing all this unconscious bias and microaggressions and just increasing everyone's sensitivity to, oh, you said something that could be interpreted in this one weird way and that might offend someone somewhere, therefore you should never say anything. And it's really stifling. Well, I think we would all agree that we would all be better off if we treated people nicer, you know, if we didn't have racism, if we didn't have sexism, if we just appreciated people for their qualities and could be very objective about that.
[168] But when you're, I would imagine that when you're running a company as large as something like Google, you kind of have to put fires out before you even see smoke.
[169] Right.
[170] And like the writing's on the wall when it comes to criticism today.
[171] And anything that people can point to when it, you know, whether it's a percentage of women, a percentage of minorities, whatever it is, whether they feel.
[172] like is off.
[173] I mean, people will write articles about this.
[174] It can damage your stock profile.
[175] Yeah.
[176] I mean, it can, like, companies can take a hit on the stock market because of an article that someone could write about a lack of diversity.
[177] Like, oh, geez, they're a lack of diversity.
[178] Like, that's a real issue.
[179] Yeah.
[180] And there have been reports of companies that'll have these diversity programs and then blackmail companies if they don't take them.
[181] So, say, you know, they'll start complaining because, you know, all of these companies are the same and that they have about, you know, 20 to 30% women.
[182] So they could do the same attack against anyone.
[183] And so they blackmail a company say, oh, you need to do these certain programs.
[184] And if you don't, then we'll start doing external pressure on you.
[185] So who are the companies that are blackmailing them?
[186] Yes.
[187] Or the groups.
[188] At least from what I've heard, and this is all secondhand, it's a lot of the programs that, so they'll hire contractors to perform some of the diversity programs.
[190] Oh, so they have sort of like Jesse Jackson used to do with the Rainbow Coalition.
[191] Do you know the story behind that?
[192] A little bit.
[193] This is the second hand story, but the second hand story was that he would go into these groups and if anybody had said something, whatever reason they had to get into this company, they would go into this company and then they would charge them a tremendous amount of money to go in and create these diversity programs.
[194] And if they didn't do that, then they would shame the company, and they would claim the company was racist.
[195] Yeah.
[196] You know, and Jesse Jackson had, like, this laundry list of things he wanted, like, jumbo shrimp cocktail and all this crazy shit and limo rides.
[197] Like, really, like, it's been kind of documented.
[198] You know, I mean, I'd have to go back over it again.
[199] I remember it only barely.
[200] But the idea, I mean, that's where he got that moniker, race pimp that what he was essentially doing was race pimping.
[201] And then he was going around and, you know, kind of threatening people: we will call you a racist, we'll call your company racist, comply in this manner.
[203] And that, that's scary.
[204] Yeah.
[205] I mean, I see that a lot at Google, not necessarily, you know, the same threatening, but just people feel that they have to walk on eggshells.
[206] Otherwise, they'll get reported to HR by some random activist within the company.
[207] You have activists in the company.
[208] Yeah, and, you know, that was sort of made public, you know, with all of this, where there were some people that just really pushed and started complaining a ton based on my document.
[210] They would email my HR, everyone up my management chain, and they'd write all these posts and try to coordinate people to really shame me. And then they started tweeting about it after, and that's how it leaked externally.
[211] Ooh, what was the criticism of the memo?
[212] Like, did anything make sense?
[213] Did anything make you go, hmm, I could have worded that better?
[215] Obviously, the neuroticism, I could have worded that differently.
[216] I mean, the fact that I didn't talk about all the biases that are against women as much, but it was really that this was a Google Internal document.
[217] And so we already have so much stuff about the potential biases against women.
[218] And this was just the other side of the story, the other perspective that wasn't being heard.
[219] So, yeah, I don't really know any criticism that was really, oh yeah, that was definitely, I should have done that. Man, so you put this memo out there, and then the memo got leaked, and then once it got leaked, you got fired? Yeah, soon after. But they knew about the memo already, right, and they were cool with it? Like, how long had the memo been floating around? About a month. Wow. So as soon as it went public, they're like, yikes, get rid of him? Yeah. Wow. It seemed to just be a PR thing. Of course. Yeah, but it's also, it's weak, you know? It's really disturbing that someone couldn't look at this for what it really is. Like, this is an opportunity to have a discussion about the subject, right? You know, I mean, here's this very deep, detailed thing.
[220] If you guys disagree with it, let's debate it.
[221] Let's talk about it.
[222] Like I said, the only thing that I thought was even remotely derogatory was that one word or that one idea that women are more prone to neuroticism.
[223] Other than that, it just seemed to me to be evolutionary psychology.
[224] It seemed to be like a lot of stuff that has already been really well researched.
[225] This is some pretty clear differences.
[226] And again, it's not all women or all men.
[227] But there's a tremendous amount of evidence that shows that males lean towards certain professions, and females lean towards other professions.
[228] Yeah, and these are based on surveys of like half a million people.
[229] So, I mean, people are saying, oh, yeah, this is just one study that showed this.
[230] Like, no, it's many different studies across many different countries.
[231] And, you know, there have been even experiments that link this to just prenatal testosterone.
[232] Yeah.
[233] Which is pretty strong evidence that there's some biological link.
[234] Well, also, if you have a company like Google, which, by the way, before, we go any further.
[235] I'm a big fan of Google.
[236] I use their products all the time.
[237] I have a Google phone.
[238] I mean, I think they're amazing.
[239] I think their browser is excellent.
[240] I use Chrome.
[241] I think they kick ass.
[242] Every morning, I go to my phone and I check the Google news.
[243] I have a whole setup, but that's like one of the first things I do.
[244] I check the news on my phone from Google.
[245] So it's not like I'm an anti-Google person, but if there wasn't some sort of evolutionary reason or some sort of a prenatal testosterone reason or some biological reason why people were inclined to choose one profession over another,
[246] Google would have to be a fucking horrible company.
[247] If everything was even, if everybody was 50-50 and they're only hiring 20% women, that means they're monsters.
[248] Yeah.
[249] That means they're suppressing 30% of the women.
[250] They're just like, fuck you, you can't work here.
[251] You can't get hired.
[252] You're just as good as us, but fuck off.
[253] This is a man's club.
[254] They would have to be monsters.
[255] Yeah, and that's why I feel like some people are shaming me, like, oh, this is such a bad thing to tell little girls that are interested in technology.
[256] When really, I think this is a much better view of the world where just, you know, yeah, if you're interested in technology, great.
[257] There aren't as many women like you, but if you are, that's amazing.
[258] While the other side of the story is, oh, no, even if you are, then you'll face all these challenges and it'll just be an uphill battle against sexism.
[259] and you'll never be seen as good as a man, and, you know, that's not very encouraging to a lot of people. Well, it's also not necessarily accurate. I mean, you're kind of bending the truth to meet your narrative, you know, where instead we should maybe look at, like, what are the differences between men and women. But that's the thing, like, people don't want to even accept, there's a trend today to not accept biological differences between the sexes, which is just fucking bananas. Like, let's just not accept the fact that water gets you wet.
[260] It's just weird.
[261] It's weird when people ignore truth to fit their ideology.
[262] And when you're looking at just sheer numbers of people, all you have is these numbers.
[263] I mean, you can have a bunch of reasons why.
[264] But to say that the only reasons are implicit biases, that the only reason is some sort of discrimination against women.
[265] That's the only reason why they're not 50 -50.
[266] That's crazy.
[267] that means we're monsters, right?
[268] I mean, doesn't it mean we're monsters?
[269] That means all men are monsters.
[270] Yeah, when it's often the exact opposite.
[271] You know, we're very welcoming of women.
[272] We really want every woman that we can get.
[273] And, you know, they'll even twist these studies that they have where they'll do these large analyses of, oh, why did you leave tech?
[274] And it'll be broken down by men and women.
[275] And it'll show, oh, 30% of women felt like there was unfair treatment.
[276] and harassment, and then one in ten women felt like there was undue sexual attention to them. And then the media will just report on that, but they don't see that 40% of men, compared to 30% of women, felt like there was unfair treatment and harassment, and then one in 12 men felt like there was, you know, unwanted sexual attention. So, you know, they completely disregard the other side of the narrative, that, you know, it's not really a gender issue, there's just unfair treatment in general, you know?
[277] Well, I think men are gross and I wouldn't want to work with them in an office.
[278] I mean, if I was a woman, I would think that would be the worst place to work because in an office with men, especially if I was attractive and it was just around a bunch of goons or staring at my butt and just saying stupid shit, men are gross.
[279] I mean, I think, like, in general, there's an issue with men and women working together because a lot of men are gross.
[280] You know, I mean, it's not all of us, obviously.
[281] But, I mean, just if I want to be honest about it, I would say that, man, I think women probably have to deal with a lot of shit.
[282] But is that the reason why only 20% of them are in tech?
[283] Because that's not the case with all jobs when women work together.
[284] And I think men are gross across the board.
[285] They're not just gross in tech.
[286] I mean, they're probably gross.
[287] Like, what are jobs where women are disproportionately represented, like, on the other side?
[288] Like, is it like health care probably?
[289] Yeah, so nursing, veterinarians, schools.
[290] So a lot of things that deal with people or animals in this case.
[291] Yeah.
[292] Well, I bet they deal with gross dudes there too.
[293] You know, there's a lot of gross dudes.
[294] But that doesn't stop them from being hired at a disproportionately favorable percentage.
[295] Yeah, we've got to look, I think, collectively.
[296] Here's one good thing.
[297] Here's another good thing about Google, because I don't want to trash on Google.
[298] And the good thing about tech companies in general, I feel like we are in a way better position, that tech companies are leaning way left.
[300] I think we're in a way better position socially that tech companies are being extremely concerned about diversity because you just don't feel that in a lot of companies where they're about the hard line.
[301] They're about the bottom line, making money, kicking ass, taking names, pushing the company ahead, and they're about infinite growth.
[302] This is not what I see from tech companies.
[303] What I see from tech companies is extreme caution when it comes to social issues and this extreme desire to be thought of as being very diverse, very fair, very liberal.
[304] I think that's good.
[305] I really do.
[306] I think it balances it out.
[307] And I also think when I think at least about the smartest people in the world or the most innovative people in the world today, I almost always think about tech.
[308] Because I think about like, well, if you looked at the human organism, you look at the human organism, the human species as a whole.
[309] And if you looked at, like, what does it do that's most impressive?
[310] Well, what it does is this constant innovation and this constant desire to make things more and more efficient, faster, more capable, and a big part of that is tech.
[311] So the people that are, like, in a lot of ways, at least the most technically creative, those people are oftentimes very left wing and very liberal.
[312] So I like the fact that Google has this as a thought process.
[313] I just wish that it was unbiased in its determinations when it comes to biases.
[314] Does that make sense?
[315] Yeah, so I agree that, you know, being progressive isn't necessarily a bad thing.
[316] Right.
[317] And, you know, it is great that Google has this don't be evil motto.
[318] And, you know, they've decided, oh, yeah, we get a ton of ad revenue, therefore we can do a ton of random stuff.
[319] That's good for the world in general.
[320] But, you know, I think, unfortunately, their political bias has created, you know, they haven't forgotten their don't be evil motto, it's just that don't be evil has turned into just don't disagree with us and what our ideology says. They got sloppy. Yeah. Well, they're just a little off, but they're going the right way, you know. And look, it's very difficult to, fucking, I mean, how could you run a giant company like that and be just totally cool and above ground and have it all worked out?
[321] I mean, it just doesn't happen, you know, and especially when you have all these internal influences, like you're talking about these activists that work, that have, they have a vested interest in proving that there's racism.
[322] There's a vested interest.
[323] Like, when you go looking for, you know, if you have a hammer, everything becomes a nail, right?
[324] If you're a person who's the type of person that's looking for racism everywhere, fuck, man, you're going to find it in all these weird places that don't even make sense.
[325] Like these hidden unconscious biases where you have to examine yourself. Don't just look at overt actions and see whether or not those actions are racist, you have to actually examine all your thoughts and try to find racist thoughts, because they are in there whether you want to believe it or not. Right. Oh, Jesus, this is a goddamn ghost hunt, you know, it's a witch hunt. It's like, again, even though I'm a white man, I really feel like it's leaning better that we're shitting on white men than, you know, if it was the other way.
[326] If we were shitting on minorities, I mean, it would be very disturbing.
[327] If an enormous company like Google was going, well, let's just be honest, Puerto Ricans are lazy.
[328] You know, like, whoa!
[329] But if a company comes along like Google and it's like, you know, white, you can't be racist towards white people, like, okay, like, at least we can work here.
[330] We could talk.
[331] We could talk about this.
[332] But this is, you're saying something fucking crazy and racist.
[333] I know you don't think it's crazy and racist because you're trying so hard to not be racist towards minorities that you're looking at what's a temporary majority.
[334] I mean, white people are only a majority for another decade, right?
[335] Yeah.
[336] You know, I hope it evens out.
[337] But I feel like in defense of Google, it's better to be leaning incorrectly in that direction than to go the other way.
[338] Yeah, I mean, I think it's fine to have a leaning.
[339] It's just you need to not be blind to the other side.
[340] And I think that that's what's happening right now where, you know, they're completely shutting down the conversation.
[341] And they're really making certain employees feel completely alienated.
[342] Well, yeah, it seems like you can't, obviously, you try to talk about it, and you were fired.
[343] Yeah.
[344] You know, I mean, you were, you were shamed for a little while, and then it went public, and then you were fired.
[345] Now, why did they do it, did they send it publicly because they knew that people would have a negative reaction towards it?
[346] That's what I think.
[347] Yeah.
[348] Do you know who did it?
[349] It was probably the people that were tweeting about it and saying that I was just a misogynist Nazi person.
[350] I don't know.
[351] Yeah, I mean, Nazi was definitely used, white supremacist.
[352] Nazi was used?
[353] Yeah.
[354] They just keep escalating.
[355] And at some point, I don't really know it'll happen.
[356] You know, white supremacist is now being used for a lot of things.
[357] You're a white supremacist.
[358] Somehow.
[359] Wow.
[360] And at some point, people will just see, no, these people aren't actually that.
[361] and they've just created a bubble of words that they say, and it just keeps getting more and more extreme, and at some point it'll just shatter like an economic bubble. But that's very dangerous, because it opens a door to competition to Google, like someone who's more rational, and I think that's unfortunate for Google, to be supporting these ridiculous ideas. Like, I read this one article where this woman was calling you a misogynist, and she was being really brutal, you know, but it was a total false narrative, because I was reading it, and I was trying to, like, I'd read your memo.
[362] So I read your memo and then I read this article about your memo.
[363] I'm like, this is like an angry person that has just decided that the focus of all the woes of the world is James, and I'm going to shit on James, and that the misogynists of the world, like James, are the reason why, you know, women can't excel in tech.
[364] Yeah, and I think part of it is that there's just an asymmetry.
[365] So there's no punishment for writing this really angry letter that says how misogynist I am.
[366] Yeah.
[367] Even though, you know, that's negative to me and anyone else that has similar viewpoints.
[368] So there really needs to be some sort of retribution maybe for people that just so openly are so negative about it.
[369] You could just get away with it.
[370] Yeah.
[371] And then no one questions it.
[372] It's not open for debate.
[373] That's really part of the problem.
[374] It's like people are so looking for things to be racist that when someone cries racism, if you debate it at all, like, well, how is he racist?
[375] You're a Nazi too?
[376] Like, you become a Nazi for like discussing things.
[377] Even if you, I mean, even if you just objectively go over the facts and don't agree with their assessment, you become a racist.
[378] Yeah.
[379] And, you know, even if you don't say anything that's overtly racist, they'll say, oh, yeah, that's just dog whistling.
[380] And, oh, yeah, you can tell what he meant when he said this.
[381] You could see it in his eyes.
[382] Yeah.
[383] This is not hyperbole, what I'm going to say.
[384] But this is real.
[385] This is how McCarthyism got started.
[386] Right.
[387] This is how it got started.
[388] Everyone was looking for communists.
[389] And you couldn't even explore what communism was.
[390] Like, you couldn't be confused.
[391] Like, if I read a book today, like I've got a book over there by Michael Malice on North Korea.
[392] If I read a book on North Korea, like, well, what's going on in North Korea?
[393] People wouldn't be like, Joe Rogan's a North Korean supporter.
[394] He wants to move to North Korea.
[395] He wants us all to be under a communist dictatorship run by Kim Jong-un.
[396] You wouldn't say that, right?
[397] Well, back then, you would.
[398] Back then, during the McCarthy era, like, if you started reading, like, communist newsletters or you started going to a meeting, like, what is this all about?
[399] You could get shamed, run out of Hollywood, and it was a giant issue.
[400] And people were ratting on people, and they were doing it for the same reasons: they did not want to be lumped in with this group, so they would immediately turn people in. They were turning in their neighbors. It was, like, a scary time where people were looking for the communists, everyone was looking for the dirty Reds, the Red Scare, you know, they were going to come and infiltrate our world. It's very similar, because it's a mindset, this mindset of not looking at things objectively but having everything boxed into these very convenient packages, and this is one of them: that diversity is of the utmost importance, and that anything that does not challenge that idea, or rather anything that does not support that idea, is racist. Yeah, and that was sort of what I was trying to say when I said de-moralize diversity, because, you know, we've just put it on such a pedestal, and we've stopped looking at the costs and benefits of it, and we've just started looking for villains, you know, all the racists, and we just want to punish those villains and label anyone that disagrees with any of the precepts of diversity as some sort of evil person.
[401] Well, it's just a foolish approach, especially the approach of making Asian people get higher scores.
[402] That is so racist.
[403] Like, yeah, they study harder and do better.
[404] What's the reason?
[405] I don't know, but whatever the reason is, they do it.
[406] I mean, is it cultural, probably.
[407] Is it biological?
[408] I don't know.
[409] But whatever reason it is, the correct response to that is not to make Asians get higher scores.
[410] That's fucking insane.
[411] That's super racist.
[412] You know?
[413] I mean, how racist is that?
[414] That's crazy.
[415] Like, why are they, I mean, and they're a minority, which is even weirder.
[416] But it's somehow or another that one is like, we let that one slide.
[417] Because we know they don't complain and they kick ass and they go and study hard.
[418] So for some reason, we like let that one slip.
[419] Yeah, and a lot of this has some really nefarious history, where, in the beginning, you know, we used to just have tests, and that would be how you got into Harvard, for example, and whoever had the highest score would get in. But then they saw, oh, there's too many Jewish people getting in, and so they started adding all this, oh, let's look at your extracurriculars, and let's make it more subjective on who we let in. And that way they could discriminate against Jewish people, really.
[420] So that's how it started?
[421] Yeah.
[422] And this was like early 1900s.
[423] Wow.
[424] Wow.
[425] Well, yeah, there's another one.
[426] There's a disproportionate amount of European Jews that are Nobel Prize winners.
[427] Right.
[428] Why?
[429] Well, they're fucking smart.
[430] Like, what does that mean?
[431] Does that mean that we're prejudiced against Irish people?
[432] No. What does it mean?
[433] Well, whatever it means, the end result is what's significant.
[434] We're not stopping other people.
[435] from taking these tests, right?
[436] If you get a disproportionate amount of European Jews, there should be some sort of study, and there has been, but there should be some sort of studies as to what is it culturally, like, what is the significance, what has happened in the past that led this one group of people to be extraordinarily successful in one area.
[437] Well, that's what we should study.
[438] We shouldn't try to keep Jewish people out.
[439] That's fucking insane.
[440] And it's racist.
[441] And I think Asian people are not complaining the same way other folks would, you know, with the same exact issue, you know?
[442] I mean, it's essentially a reverse affirmative action sort of a situation.
[443] Really weird.
[444] Yeah, it's unfortunate.
[445] I mean, especially since many of these are just first generation immigrants, they don't feel like they necessarily have the power to really stand up to some of this.
[446] Do you have very many Asian friends? Have you ever been around, like, really strict Asian households? Yeah, I mean, the culture is definitely different, and there's a higher priority on school and more traditional values. I had a good buddy of mine when I was young who was Korean, and he was in medical school, and his parents were brutal. I mean, they just wanted A's across the board, no fucking excuses, you will study until your hands bleed. And, you know, there was just this sort of culture of success in that household, and of hard work and work ethic. And, you know, the family's idea was like, look, we came over to America from South Korea so that you could kick ass, period. You're not going to come over here and fuck off. And obviously he was a fucking straight-A student and just a wizard.
[447] I mean, this dude was just always awesome at everything and always working really hard, but he was completely stressed out all the time.
[448] Every time you'd see him, he was like, but just getting everything done.
[449] But, I mean, it's the culture that he grew up in.
[450] So to discriminate against that guy and say, well, you work too hard.
[451] Hey, Jungshik, you can't, you know, your scores are a little bit too high.
[452] We don't like it.
[453] So we're going to need a higher threshold for you.
[454] That's racist.
[455] Yeah.
[456] And there's nothing that they can do.
[457] I mean, you can't work harder.
[458] No, no, it's stupid.
[459] It's like saying to athletes, like certain athletes, oh, well, you know, you've been training too hard, and so we're going to need a faster 40-yard dash from you than a regular person to get on the team.
[460] You would never say that.
[461] You would say, well, this guy is obviously super dedicated and gifted.
[462] This is the guy we want on our team.
[463] And that's the one thing where I feel like we don't see a lot of this stuff.
[464] You know, we like results when it comes to athletics.
[465] when it comes to things like, what's your number, what is the fastest you can run, how high can you jump, what's the pole vault that you do, you know, how far do you throw a discus. All those things are very clear, there's very clear numbers. You can't do that same sort of approach that you're doing with academics or with industry, you can't do that approach when it comes to athletics. I mean, I'm not, you know, suggesting that the whole world is a sport, but I mean, when it comes to things like scores and keeping people out and letting people in and trying to push, you know, to get more people of a certain color or ethnicity in.
[466] Like, you're, you know, you're doing some slippery work, man. You know, it gets real weird when you start doing that.
[467] Yeah.
[468] It's all about leveling the outcomes of people.
[469] And it, and there's this scary Kurt Vonnegut short story where, you know, if you're really smart, then you'll have to wear headphones that just beep all the time.
[470] If you're beautiful, you'll have to wear a mask in the future.
[471] If you're strong, you'll have to have all these weights on you.
[472] And, you know, it's sort of getting there.
[473] It's the same sort of ideology.
[474] And it's scary.
[475] Well, life is not fair.
[476] Right.
[477] It is just not.
[478] No one wants to hear that.
[479] And this is really the core issue for all of this stuff.
[480] Life is not fair.
[481] There are people that are so much fucking smarter than me, that when I talk to them, I feel like some sort of a monkey, you know?
[483] It's just, there's no getting around that.
[484] There are people that are so much bigger than me. When I stand next to them, I feel like a child.
[485] You know, there's just no getting around that.
[486] That's just the way of the world.
[487] And I think the key is, I mean, I guess with a company, is to try to figure out how to manage all of these unfair aspects of being a biological entity in a civilization.
[488] And I don't think Google's doing the right job by firing you for promoting science.
[489] Because that's what you're doing.
[490] You know, I had a friend who actually was comparing what you did to, what's that term, phrenology, when you study the size of people's heads and determine things about the person.
[491] And I was like, man, you can't say that.
[492] It's not what he's doing.
[493] It's not what he's doing because he's not saying that women can't do it.
[494] He's not saying they wouldn't be better at it.
[495] He's simply using science and citations to describe many of the issues.
[496] that probably led to people choosing what they choose to do for a career.
[497] Right.
[498] But you can't do that, man. Look, look at you.
[499] You're here.
[500] You're everywhere.
[501] You're talking about this?
[502] Yeah.
[503] Hopefully people will start seeing that at least how much the media was misrepresenting it.
[504] Yeah.
[505] Did you feel frustrated by all these articles?
[506] I mean, it's got to be weird to have people call you a white supremacist and a Nazi.
[507] Yeah.
[508] And they also try to dig up any dirt that they can find on my history, like stuff way back in high school that I might have done.
[510] I heard you playing Tomb Raider.
[511] He played as Lara Croft.
[512] He was a girl running around with big tits.
[513] Do they find anything?
[514] Not really.
[515] Damn, dude.
[516] What if you had like some dark secret?
[517] Yeah, that's the thing.
[518] Like I could have done some random thing that was bad.
[519] Of course.
[520] But I mean, that wouldn't change the fact that what I wrote wasn't this sexist thing.
[521] Right.
[522] Right.
[523] If you did something horrible in the past, at least people could go, oh, okay, maybe this guy's a bullshit artist, and he leaned this stuff towards sexism, even though there is some science behind it.
[524] What he wrote was biased, but I haven't seen a legit criticism of the actual work itself.
[525] I really haven't.
[526] I've read a lot of stuff on you, man. It's a little creepy.
[527] I haven't seen anything that made sense.
[528] Everything that was criticizing you was being really dishonest.
[529] Yeah, it was either just, oh yeah, this is obviously misogynist, or they would attack claims that I didn't make.
[530] They were like, oh, yeah, we've shown that women are better in school and are doing better in math.
[531] Yeah.
[532] It's like, okay, I wasn't talking about that at all.
[533] Yeah, it does not, that has nothing to do with career paths.
[534] You know, it was really fascinating to me that the woman who's the CEO of YouTube responded and said it hurt her when she read your memo.
[535] I'm like, you're the fucking CEO of YouTube.
[536] You won.
[537] You're the winner of winners when it comes to YouTube.
[538] Like, you're at the head of tech.
[539] Like, no one's saying you don't exist.
[540] No one is saying you can't do it, you obviously did it, you're fucking running the thing. It's just crazy. Why did it hurt? Science hurts? Like, what hurts? Like, look, there's outliers, right? There's always going to be. It's interesting to find out why straight white males choose different career paths. Like, why? I mean, there's so much variation, there's so many variables, there's so much difference. There's people that are, you know, there's women that are MMA fighters. Like, why?
[541] Why are they doing that?
[542] Like, what is it?
[543] I don't know.
[544] Like, there's women that are race car drivers.
[545] There's, like, there's outliers.
[546] Does that mean that we need an exact representation of males to females in MMA?
[547] Well, that's insane.
[548] That's not going to happen for whatever reason.
[549] Is that the same, the NASCAR, it's not going to happen.
[550] There's not some implicit bias that's keeping women from driving 250 miles an hour.
[551] I don't know what it is, but I don't think that's it.
[552] I think there's probably some biological differences between men and women, and they vary.
[554] There's a spectrum.
[555] Yeah, I mean, for NASCAR, it's likely, you know, risk aversion.
[556] Yeah.
[557] And some stuff related to that.
[558] Same with MMA, I'm sure.
[559] Yeah.
[560] I mean, I wouldn't want to do that.
[561] Yeah.
[562] I mean, look, dude, I work in it and I wouldn't want to do it.
[563] You know, and I know a lot of pretty girls that are doing it.
[564] It's very weird.
[565] So it's, look, there's people make choices, you know.
[566] Some people choose to get their bodies tattooed.
[567] Some people choose to do all sorts of strange things.
[568] You know, I don't know why they do what they do, but it's interesting to study them.
[569] And it seems to me that all you were doing was talking about your own personal frustration with this very narrow -minded approach to diversity.
[570] Right.
[571] And, you know, they never even say what exactly I could have done differently to not do this.
[572] And, you know, there was actually a great piece in The Atlantic or something that was directed at Sundar. It's like, okay, what specific parts of the document were against the code of conduct?
[573] And what parts are free to discuss and what are not?
[574] Because right now, you can't discuss anything.
[575] You know, he just said, oh, yeah, this document is invalid.
[576] And so it means that no one can bring up any of these issues now.
[577] And they just have to walk on these really vague eggshells, when really, if they said, no, this specific part is unacceptable, everything else is fine,
[578] Then at least there would be some wiggle room and people would know what the rules are.
[579] And we see this a lot with Google policy, where they had these, like, no-jerk policies. No jerk?
[580] Yeah, like don't be a jerk.
[581] Okay.
[582] And where jerk is totally up to them to define.
[583] Yeah.
[584] And so there could be these people that just, you know, harass you based on your white male privilege and, you know, oh, you're a conservative, therefore you're evil.
[585] And that's not being a jerk.
[586] but then, you know, questioning some of their viewpoints and, like, the narrative at Google, that's being a jerk.
[587] So white people are open game, essentially.
[588] Like, if someone is questioning you about something and you happen to be a white person, they're going to get away with far more?
[589] Yeah, they try to invoke this a lot, too, in these programs, where you're encouraged, when you ask a question or something, to say, as a white male, this is what I feel.
[590] Oh, Jesus.
[591] And that, I just think that that's really going down the wrong path.
[592] Shit, get real super specific.
[593] As a white male with a fat dick and a large pornography collection on a hard drive, this is how I feel.
[594] Like, like, what?
[595] No, I think even mentioning pornography would be some sort of microaggression.
[596] That's a major aggression, I would assume, right?
[597] Sexual harassment.
[598] Yeah, you were, they cited that you promoted harmful gender stereotypes.
[599] Right.
[600] So I had to go over it again.
[601] I'm like, okay, let's read this fucking thing one more time.
[602] Like, I don't, I don't think you promoted any stereotypes.
[603] You were talking with citations about science.
[604] And that's, that's where this whole thing really confused the shit out of me. Have you had many people, like, what is the, has there been like a 50-50 sort of reaction?
[605] Like, 50% of the people were like me, kind of confused about this, and then 50% of the people were just knee-jerk calling you some sort of a sexist or a Nazi?
[606] Yeah, so at Google, they had an internal poll with about 800 people, and about 40% of people agreed.
[607] 40%?
[608] Yeah, 50% disagreed.
[609] 10% were neutral.
[610] Yeah, cowards.
[611] And even the 50%, probably a good percentage of them were just being pussies.
[612] It just doesn't seem like, if you're looking at it really objectively, you could look, they obviously want a result, and that result is the maximum amount of diversity. And I feel like if that's your result, if that's what you're looking for, shouldn't the approach be, let's just not discriminate, just be open and just try to get the best people? Like, wouldn't that be the best way to do it? And then if we run into problems, like, you know, we've tried to do this best-people thing, but all we have is Asians. So, yeah, so even suggesting that we should go to some meritocracy thing, yeah, that's a microaggression. Meritocracy is a microaggression.
[613] Yeah.
[614] Yeah.
[615] So there, what's the argument for that?
[616] Like, why is meritocracy a microaggression?
[617] Because it'll make some people feel unwelcome.
[618] Because they have to perform?
[619] It's basically just anything against the left's ideology is a microaggression in some ways.
[620] So anything that could make anyone feel offended, particularly people in certain groups.
[621] Man, I've been liberal for a long time, and I've never seen it this bad before. I don't know what happened, when it happened, when did it get so slippery? It seems like in the last 10 years, right? Yeah, I think the internet has accelerated a lot of this, where there can be these online mobs that enforce these social rules. Yeah, but I think at least now some people can see it for what it is. Well, I think what you're seeing is that there's a fear of retribution, and that's one of the reasons why people are toeing the line, is that they're worried about these, like, hyperaggressive people that are coming out against people that don't toe the line.
[622] They're, you know, like you're saying, shaming you.
[623] And that is a, that's a disturbing aspect of human nature that I don't think should ever be reinforced.
[624] And, and I think it's hard to call those things out individually, because, collectively as a group, if this group of, you know, diversity-minded folks, left-wing-minded social justice warrior types are attacking you, you feel very isolated, and there's not a lot of support.
[625] Yeah.
[626] You know, and so most people just acquiesce, they just back off, they just give in, they toe the line, they just alter their thoughts, or they keep it to themselves.
[627] Yeah, I mean, I think shaming does have its benefits.
[628] Sure.
[629] And when we were in small groups, you know, if someone stole something or was mean, then actually shaming them is good.
[630] But you're talking about tribal groups back in the past, yeah.
[631] But now that anyone across the world can just randomly shame you and attack you, that's not really what our brain was meant for.
[632] Right.
[633] Just, you know, even me seeing just random messages telling me that I'm some horrible person, that, you know, that hurts me. Yeah.
[634] Even though I have gotten a lot of actual private messages saying, yeah, we support you.
[635] you're not alone.
[636] But I have to keep my mouth shut because I don't want to get fired from Yahoo. Right.
[637] I've met with so many people and they're like, and of course, you know, don't tell anyone that I met you.
[638] That's so weird.
[639] Now, you obviously were not a public person.
[640] Right.
[641] You were a guy who was just working.
[642] What is your job at Google?
[643] Software engineer.
[644] I was working on the indexing and serving of search.
[645] So to go from that, which is, like, you describe yourself as an introvert, right? And to go from that to this massive exposure, and to be essentially the lightning rod for a real hot topic. I mean, this is one of the most hot-button topics you can get: men versus women in tech, or men and women in tech, women, diversity, white people, black people, racism, Nazis. You're like, you're at the fucking tip of the spear, buddy. Yeah, and I'm really afraid that I'm actually just, you know, polarizing the issue even more and separating people. Because, you know, it's really shown that the stereotypes are real in some ways, that there are some really extreme people on the left and really extreme people on the right. Maybe, yeah. And, you know, we really need to bridge it and say, okay, let's actually have a discussion, let's talk about what's actually happening, and nothing is really off the table in this discussion. But that's not happening.
[646] And Google itself, from what I've heard, they've just been doubling down on the diversity stuff, and they haven't addressed any of the political discrimination.
[647] Wow.
[648] Well, I think you're right, and I think that's, that has to be your motivation for writing that thing.
[649] I mean, that was a very well -thought -out memo.
[650] And I don't think someone who wanted to separate people would have written that.
[651] The way it seemed to me as an outsider with no dog in the fight, I was looking at it like, oh, this guy is probably, like, frustrated at what he sees as these sort of social justice warrior tactics, and these aren't logical, and this is not rational, and, like, maybe my breakdown of this situation scientifically, evolutionary psychology studies and all these different random factors that may have contributed to women choosing these careers...
[652] Maybe this will, like, help ease off.
[653] Maybe people aren't aware of this information.
[654] Yeah.
[655] Yeah, and I definitely have a bias where I thought, you know, we could just sit down and discuss it rationally.
[656] That's all I ever wanted was sit down and discuss it with them.
[657] Yeah.
[658] But I really underestimated the sort of group -based emotions that were behind this.
[659] And that's scary.
[660] Yeah.
[661] Yeah.
[662] Well, you know, they're fucking, one of the things that was important about Charlotte, I think, Charlottesville, rather, is that we got to see real Nazis.
[663] Like, hey, man, they're real.
[664] It's not the fucking guy writing the Google memo.
[665] It's this asshole with a swastika on his chest.
[666] He's carrying a tiki torch walking on the street with a gun in his pocket, like, you know, ranting about the Jews and black people.
[667] That's a real Nazi.
[668] and that is what you were saying.
[669] There's extreme people on the right and there's extreme people on the left.
[670] And they don't understand that they're way more similar than they like to believe.
[671] If you believe that all white people are racist, if you believe that it's impossible for you to be a white person and not have some implicit bias and some racism, and that it's impossible to be racist against white people because racism is about power
[672] and influence, and minorities do not have power and influence, so they can be prejudiced but they cannot be racist.
[673] Well, you're just as bad as a fucking person with the tiki torch.
[674] You don't think you are.
[675] I know you don't think you are, but you are because you're just as ridiculous.
[676] You're so off of what is real.
[677] You're so off.
[678] You know, the idea that all black people are responsible for the woes of society and that none of it has to do with the fact they were captured hundreds of years ago and brought over here as slaves and that they're lesser as human beings.
[679] That's a disgusting, ridiculous proposition, and the people that think that way are fools, right?
[680] And that rightly so, most people in the center look at those as fools.
[681] I look at the people that think that you can't be racist against white people as just as foolish.
[682] You dumb fucks are fueling these assholes.
[683] Like with this dumb way of looking at things and pushing these ridiculous ideas that all white people are racist, you're supposed to feel bad because you're white.
[684] I didn't do anything.
[685] I didn't do anything.
[686] Didn't ask to be born white, didn't ask to be born male, okay? Like, you can't get mad at people for who they are. Yeah. We should be having an open discussion about what is wrong. Like, what's wrong? Like, what is going wrong? Why is this happening? Why are these negative things happening? Not, why don't we have more women, or why don't we have more Indian men, or why don't we... I mean, that's, that's fucking ridiculous. It's ridiculous. Yeah, it's crazy, and the two sides are just scapegoating, so it is very similar, and not taking any personal responsibility. You know, at least, you know, what Jordan Peterson would say is just, you know, fix yourself before trying to fix the ills of the world.
[687] Well, I mean, I think there's also an issue here is that I don't, I've got to be very careful with my words, but I feel like this is a game.
[688] And I don't mean it's a game like there's not, it's not a real issue.
[689] It's absolutely a real issue, but I think people play for points.
[690] And I think that there's a real issue when people do things for social brownie points.
[691] Like Google's saying that you were fired for promoting unfair gender stereotypes, or dangerous, or what was the word that they used?
[692] Harmful.
[693] Harmful.
[694] Harmful gender stereotypes.
[695] That is a fucking play.
[696] That's a play for points.
[697] 100%.
[698] Okay.
[699] Where are the fucking harmful
[700] gender stereotypes?
[701] Where are they promoted?
[702] You tell me how.
[703] If you don't tell me how I want a fucking apology, because you're lying.
[704] You're lying because you want all those people on the left to calm down.
[705] Well, we fired him.
[706] Oh, you fired him a month after you knew he wrote that shit.
[707] Are you guys crazy?
[708] Did you go over the science before you fired him or no?
[709] Like, what did you do?
[710] Like, where's the harmful gender stereotypes you guys talked about?
[711] Yeah.
[712] I mean, it sucks too because, you know, you really need to address some of these things if you want to address the gender gap.
[713] Yeah.
[714] And, you know, that was what, you know, a page of my thing was all about: oh, you know, if women are more cooperative and they approach the workplace differently, then maybe we can change the workplace to be more approachable.
[715] But if they're not willing to acknowledge any of these differences, then, you know, they won't do anything.
[716] So it's really annoying.
[717] Well, any interpersonal relationships with random people can be messy.
[718] You know, you get a group of 30 people together, you force them to work in a building, and it's going to be messy.
[719] People are messy.
[720] We're weird, you know, and if you have more of one group than another, that group is going to feel alienated.
[721] So if you have 80 % men and 20 % women, they're going to feel alienated.
[722] There's no way around it.
[723] But the right way of approaching it is not to distort the facts, especially when you're thought of as being.
[724] I mean, Google is essentially a pillar of information.
[725] I mean, they're one of the most important, like, hey, man, Google it.
[726] I mean, that is the thing that people say.
[727] They're one of the most important aspects of our society today.
[728] Having the ability to instantly... Nobody says, go Bing that.
[729] Nobody gives a shit about Bing, right?
[730] I mean, Bing is a joke.
[731] But Google is hugely important.
[732] So if you are essentially in charge of the distribution, of more knowledge than arguably anything else on earth.
[733] I mean, that's a big statement, but I think you might be able to, you might be able to actually say that and be pretty honest.
[734] I think Google is responsible for distributing more information than any group on earth.
[735] Right.
[736] That's a giant responsibility.
[737] And in that responsibility, you cannot say that someone is promoting harmful gender stereotypes when they're absolutely not.
[738] Because I'm going over this fucking thing.
[739] I'm pulling pages up.
[740] I'm like, where's the harmful gender stereotypes?
[741] Other than the word neuroticism, I just don't.
[742] If you got fired for the word neuroticism, well, why is that word in all these evolutionary psychology texts?
[743] Like, what's going on?
[744] Yeah, and if you just Google personality differences between men and women or something, then that'll be the first five results.
[745] So it's...
[746] Yeah.
[747] And obviously, these are just some of them.
[748] I mean, there's a broad, again, there's a broad spectrum of human beings in both genders.
[749] Right.
[750] Are you suing them?
[751] Yeah, exploring all legal remedies.
[752] Have they contacted you and go, listen, James, James.
[753] We don't have to be so crazy, James.
[754] Let's just relax, James.
[755] Let's go to dinner.
[756] Let's have some falafel.
[757] I am surprised that they never, you know, when they fired me, had me try to sign something to say, oh, yeah, you know, just, here's some
[758] non-disclosure agreement or something, and then just pay you off. Yeah. Yeah, but, wow, that was a big fuck-up on their part, it seems like it. Well, I think they feel like they're completely, I feel like the game, again, is, like, super clear. Like, oh, no, we just, we sunk a three-pointer, it's no problem. Like, this is pretty, this is pretty straightforward, it went in the net, dude, we don't have to do shit. Yeah, you have to pay him because he lost the point, you know. And I think maybe they underestimated how much negative press there would be about this, because a lot of the initial stuff was all negative, because it was coming out of the people that were tweeting about it, and then they saw that, oh, yeah, it's really not this one-sided. And, you know, a lot of the things that may happen in a case is, you know, there's a lot of discovery into what's happening in the internals of Google. I don't think they want that to happen, because, yeah, we'll actually see that, oh, yes, maybe there was this illegal discrimination happening. Now, what is illegal about the discrimination that they're employing?
[759] So I, and I'm not a lawyer, so I can't say, but at least according to our own policies, we said, you know, it's illegal to use someone's protected status.
[760] So their sex or race in employment-critical situations, like, you know, when they're getting hired, when they're trying to be matched to a manager or to a team, and when we're choosing who to promote.
[761] And, but it is happening in a lot of these places. Protected status, yeah, that's how they refer to it internally? Or is that, like, a common phrase? I think that's a common phrase. Wow, protected status. What's protected about... Yeah, you're not supposed to be able to discriminate based on someone's age or, you know... I mean, it's mostly, it was originally, like, oh, yeah, you shouldn't be discriminating against black people, but obviously, I mean, it should apply to everyone. Right.
[762] So by doing that, they've violated their own rules.
[763] Right.
[764] But they don't think about it that way because they're promoting diversity by doing that.
[765] Yeah.
[766] It's kind of weird how they cited some of the same parts of the code of conduct, where, oh, yes, every employee should do their utmost to reduce bias and harassment and illegal discrimination, when really my document was
[767] about illuminating the bias against conservatives and the harassment against them and the illegal discrimination that we're doing in multiple parts of our pipeline.
[768] There's no room for conservatives today, sir.
[769] I mean, are you a conservative?
[770] Do you feel like you're conservative?
[771] I'm pretty much just libertarian.
[772] But that's thought of as conservative because it's convenient, right?
[773] You just get immediately pushed off into that right-wing, angry white male group.
[774] Yeah, everyone that's in the center or right of that
[775] is alt-right. Alt-right, yeah. Yeah, it's pretty... So you, you favor smaller government, less intrusion? Yeah, I, I'm not super libertarian, like, I obviously believe that there's places where the government should be, but just... I do as well. Yeah, my, like, internal leaning, or, in philosophy, is more like the... Yeah, I think socially I lean more left. Like, socially, like, in terms of, like, welfare and things along those lines. And, you know, obviously this, um, protected status is driving me crazy, this, this thing that Trump's doing with, um, uh, children that were born in this country, or born in other countries and then brought over here as children, and then they're talking about deporting them. That drives me fucking crazy. Yeah, that question... The hard-right version of that is despicable. These people that I see online: why didn't they apply for citizenship? Oh, who knows, maybe because they're fucking 13. You know, like, were you out there applying for citizenship when you were 13? No. I mean, when you're 13 years old, you're playing games and hanging out with your friends, and then you find out you were born in Guatemala, and you're like, what?
[776] And you have to go back to Guatemala.
[777] What?
[778] Yeah.
[779] It's crazy.
[780] It sucks.
[781] I lean way left when it comes to those kind of things, gay rights and things like, you know, social programs for disenfranchised people and disenfranchised communities.
[782] I lean way, like, if I want my tax dollars to go to anything, I want it to go to making people's lives easier, whether it's socialized medicine or whatever we could do to make people have an easier path to success, and to not have them so burdened down by their environment and their circumstances.
[783] That, I think, is, like, our responsibility as human beings: to try to, I don't want to say even the playing field, because there's never going to be an even playing field, but to give people opportunity.
[784] That's it.
[785] Just give people an opportunity to do well, and not have it so completely stacked against them.
[786] So in that sense, I'm not very conservative in that way.
[787] Like, I'm not one of those pull yourself up by your bootstraps thing.
[788] Because that's just, that's so delusional.
[789] Like, some people are just fucked.
[790] You know, they're born with a terrible hand.
[791] Right.
[792] And it would be nice if more of us were charitable in that regard.
[793] You know, and some people think that that charity should be a personal issue and that we should all just do it, you know, as part of our community and our society.
[794] Maybe.
[795] That's a good argument.
[796] But maybe the argument is that our government should be a part of our community, you know, and that we should think about it that way, instead of thinking of it as this overlord that decides and designates where our money should go, that maybe we should have some more say in it.
[797] It should be some sort of a more, you know, just a more kind approach.
[798] So in that sense, I lean pretty far left.
[799] But I'm also pretty pragmatic, you know, and I also know.
[800] And I also know that.
[801] that if you give people too much, it's like sort of that winning lottery ticket thing or that, you know, that if you make things too easy for people, they don't try hard.
[802] Right.
[803] It's just a natural part of human nature.
[804] So in that sense, I'm conservative in a lot of ways.
[805] Yeah, like, you definitely need some sort of safety net, and to ensure that, you know, people can actually achieve the American dream.
[806] Well, just be healthy.
[807] I mean, I've been leaning more and more towards universal basic income.
[808] than anything.
[809] I think universal basic income at a certain point, like enough that you can just eat and survive and then maybe that would open up a lot more people to pursuing dreams, to going after things.
[810] I mean, I don't know.
[811] I mean, there's arguments for and against, and I think it's debatable.
[812] It'll be interesting to see, you know, Finland, I think, was proposing to start this because we don't really know what will happen.
[813] Right.
[814] And maybe people will, you know, start doing their hobbies and really find their passion. Maybe they'll just sit at home and watch TV and die. Yeah. It's really, these are the problems that we as a society will have to overcome, and, you know, of course these are just first-world problems, but, right, that, that will be what the world is like. There was another country today, I read about it on Google, uh, another country today that's considering universal basic income. But, fuck, was it South Korea? Oh, right. Really? Um, see if you can find it. It's, it's Scotland, is that what it was? Oh, I think there's many, many people that are, I mean, Elon Musk has been promoting this lately. Scotland will begin funding universal basic income experiments. Yeah, Hawaii, that's what it was. Hawaii considers universal basic income as robots seen stealing jobs. Fucking robots running on the street stealing jobs. Yeah, it's Hawaii. Um, I think there are some real arguments to be made, and I think, um, Elon Musk, who is of course a part of this automated car revolution, right, and he's, uh, he's creating these, these trucks that they're going to start using to haul things, and they're going to be automated, and it's going to remove a lot of jobs, and they're starting to talk about universal basic income as, you know, a real solution to that.
[816] I mean, it's entirely possible.
[817] It's certainly an argument.
[818] It's really worth discussing.
[819] Yeah, something like that.
[820] And hopefully the incentives will be better than some of the current welfare systems where, you know, you're not incentivized to get off of it.
[821] Yeah.
[822] If you start working, then you'll lose all of it, while universal basic income can be made such that if you start working, you'll lose a little bit, but there's never an actual incentive to not work.
[823] Mm. Yes.
[824] Right.
[825] It's not an incentive to not work, but it's an, it's a, it gives you food and shelter.
[826] Right.
[827] So then you could go pursue a dream, which I think would be wonderful.
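A minimal sketch of the cliff-versus-taper point being made here, with entirely made-up numbers: a hypothetical $1,000/month benefit that disappears the moment you earn anything, versus a $1,000/month basic income reduced by 30 cents per dollar earned. The amounts, the taper rate, and the function names are illustrative assumptions only, not a description of any real program.

```python
# Hypothetical illustration of a "benefit cliff" vs. a tapered basic income.
# All dollar amounts and rates are made up for the example.

BENEFIT = 1000          # monthly benefit under either scheme
UBI_TAPER = 0.30        # basic income clawed back at 30 cents per dollar earned


def total_income_cliff(earnings: float) -> float:
    """Traditional cliff: any earnings at all forfeit the entire benefit."""
    benefit = BENEFIT if earnings == 0 else 0
    return earnings + benefit


def total_income_ubi(earnings: float) -> float:
    """Tapered basic income: the benefit shrinks gradually, never below zero."""
    benefit = max(0.0, BENEFIT - UBI_TAPER * earnings)
    return earnings + benefit


if __name__ == "__main__":
    for earned in (0, 500, 1000, 2000, 4000):
        print(f"earned ${earned:>4}:  cliff ${total_income_cliff(earned):>6.0f}"
              f"   taper ${total_income_ubi(earned):>6.0f}")
    # Under the cliff, earning $500 leaves you $500 worse off than not working.
    # Under the taper, every extra dollar earned always raises total income.
```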
[828] I mean, look, if there's anything that our tax dollars should be going towards, it's creating less losers.
[829] Right.
[830] Less people who feel disenfranchised by the system.
[831] You know, if you can pay X amount of tax dollars but live in an exponentially more safe and friendly and happy environment.
[832] I think most people would be leaning towards that.
[833] I think it would be good.
[834] Yeah.
[835] And we'll see this, we see this too in people that start companies where it's a huge risk to start a company.
[836] Most people fail.
[837] And most entrepreneurs in Silicon Valley are men who are much more willing to take risks.
[838] But if we do have some sort of strong safety net, then it won't be so bad if you fail.
[839] And maybe that will help address some of the gender gap too.
[840] That's interesting.
[841] You know, we want women to succeed in these positions so badly that, like, a woman CEO can become, like, a superstar, like that lady from that blood-testing company that turned out to be all bullshit.
[842] Was that Thanos?
[843] Is that the name of it?
[844] Theranos.
[845] That was a fascinating case.
[846] This woman essentially was role playing as a female Steve Jobs.
[847] with a bullshit product that didn't really do what it was advertised to do.
[848] And her company was valued at, you know, something like $30 -something billion.
[849] And she was thought to be the richest self -made woman in the world.
[850] And then almost overnight, she's worth nothing because they found that it doesn't work.
[851] And the company sort of fell apart.
[852] There it is.
[853] How Elizabeth Holmes's House of Cards Came Tumbling Down.
[854] It is a fascinating story.
[855] Because this woman, look at her there.
[856] She dressed the part. She put on a fucking black turtleneck. I mean, she dressed like Steve Jobs. I remember she gave this speech once, at some, you know, women's success group or something or another, and she got up there and gave this, like, unprepared, rambling, stupid speech, and I was like, how is this woman this super genius? Well, it turns out she wasn't. You know, she dropped out of college at 19 and created this company. She started this, like, when she was in college, you know, and she, she basically just fit what people were looking for, you know, and, and bullshitted her way to billions, almost, you know. It's really kind of crazy. Yeah, a lot of people are very willing to see whatever narrative they want. Yeah. And, I mean, we see this all the time in the media, too, where they just fit the data however they want. Right. Which is why they wanted to call you a misogynist. Yeah. You know, when I, when I first read that, I was like, wow, like, this is a, this is a hot take by this lady who, she wrote the article. I think one of them was, uh, let me lady explain what's going on with women in tech. Did you read that one? I saw it, I, I remember. There's so many. Yeah, there's a lot of it, dude. There's a lot of it. So where are you at right now? I mean, you, you obviously, did they give you some sort of a pension or something like that? They didn't give you any money, they just fired you? No. Yeah, I've been cheap throughout the years, though, so I saved up some cash. Yeah, that's good. Still trying to figure out what's next. Have you gotten any job offers? I've gotten random job offers from people, but I haven't, it's hard to tell how serious they are. Have you thought about writing a book? Yeah, I mean, I'm not too much of a writer. So... You wrote that memo pretty well. Yeah, I mean, I'm famous for what I wrote, but, you know, if I, you know, had been studying how to write and stuff my entire life, I wouldn't have been an engineer. Right, but you did a great job with it, though. Yeah, I mean, it's very thorough. I like to think so. It, it addressed a lot of things, and it's unfortunate that there was that one part that is getting so much of the attention, when really it pointed out a lot of problems in our culture and a lot of suggestions for how to fix things, and it seems like none of that is really getting traction. Yeah. No, well, at least it started the conversation, right?
[857] Not at least for you, because you got fired.
[858] Yeah, I mean, in some ways, though, it has made it even more dangerous to bring these up.
[859] At least, you know, it's sort of empowered some people to at least understand some of the issues, and hopefully these things will get brought up.
[860] But right now it's sort of a toxic topic to bring up at Google at
[861] least.
[862] Do you think that it's toxic in the short term, but in the long term, it'll inspire a more reasoned, balanced conversation once the dust is settled?
[863] Hopefully.
[864] And that's sort of one of the hopes with the lawsuit is to show people that, no, Google can't just do this, that there are limits to how much they can silence things.
[865] Yeah.
[866] And you shouldn't be afraid to point out issues in the workplace. Right. And you just said, with the lawsuit, like, it's absolutely happening? Yeah, I mean, we've, we filed a claim with the NLRB, which is the National Labor Relations Board, and so they usually work with unions, and, you know, it's often employers that try to break up unions and fire people for joining unions, and that's illegal. And, you know, a lot of this, what I was doing, was a concerted effort between multiple people, you know, trying to improve the workplace.
[867] Right.
[868] And actually, you know, whistleblow on some of the illegal practices.
[869] Did you save emails where people were shaming people for being white or shaming people for having implicit bias because they were white or harassing people for...
[870] Yeah.
[871] A lot of people have been doing this.
[872] Like, there's some underground efforts within Google to at least document some of this, because, you know, while they may not be the majority, they're sort of a silent coalition within Google that's sort of upset about a lot of this. Oh, that's interesting. So there are some conservative people that work at Google? Yeah, I mean, there's definitely more than zero. More than zero. Um, is it like 20 percent, like the amount that are represented, like women that are represented in the company? So, I mean, it may be that or even lower.
[873] I think there's a lot of libertarians, so that would be the main counter to the extreme left.
[874] But, and then, like, the main retribution against people is against the social conservatives, and they feel completely alienated.
[875] So it's really unfortunate for them, but I, yeah, there's at least hundreds of them.
[876] Now, when you say, like, social conservatives, like, what, what do you, how do you classify that?
[877] Like, what would, I mean, I guess people that believe in traditional values and homophobes, say it.
[878] So I think this is a lot of what's happening, too, where people just assume, okay, because you believe, say, in traditional values and you think that, you know, marriage is an important thing.
[879] And, you know, I think that there is evidence that, you know, bringing people up in a two-parent household, whether or not it's, you know, the same sex or different sex, that is important for children.
[880] And there's a huge disparity in outcome of people with only one parent versus two.
[881] So there is something to be said about marriage and, you know, having cultural norms that support that.
[882] But so, you know, just completely
[883] alienating that side of the argument is really negative, and that's hurt our society in general, I think.
[884] Yeah.
[885] Well, I think anytime you silence discussion based on your own personal ideas of what should and shouldn't be debated, I think it becomes an issue.
[886] I mean, you could disagree with someone.
[887] And that's a very complicated issue when it comes to whether or not two parents are more beneficial to a child than one, because obviously there's a lot of reasons why people break up. Yeah. You know, you don't want to encourage people to be in toxic relationships and then show the child that, you know, this is the framework for a loving relationship, people that scream at each other and whatever horrible shit they do to each other. That gets super complicated and very, very personal. Right. Yes, it's definitely a touchy subject. It's a very personal one, yeah. So I, I don't know personally how to address that, but I think it's at least something that we should be cognizant of. Has anybody said, when, you know, these white people are being shamed in this, you know, has anybody ever stepped up and said, hey, this is racist? They might have, but never in a public forum that I ever saw. But publicly, white people have been criticized, right? And, you know, there's all these negative stereotypes of men and white people, and, you know, those gender stereotypes are fine. But, and, you know, the whole idea that, you know, I'm only here because of my white male privilege, therefore I'm somehow a worse programmer than all the other non-white non-males. Is that implied, or is that stated? It's implied that, you know, they get it easier in life and in the interview process and in their evaluations. Yeah, white and maybe Asian males. So how is it implied, though? Like, can you give me an example? They'll say explicitly that, yes, these groups of people are disadvantaged, these are advantaged, there's this privilege that they have, and we've seen it time and again through all these evaluation processes that they're better evaluated and these are worse. And they often just see whatever data that they want, you know, like the case before where they just pulled out the female side without seeing that, oh, the male side was pretty much the same.
[888] And it's crazy.
[889] And you even see it in some of their internal studies where, you know, they were trying to show how racist or sexist Google was and how much worse women have it.
[890] So they were looking at the code review process where, you know, you can submit code to be reviewed and then someone has to approve it before it goes into the code base.
[891] And they were looking at, okay, if a woman's the author of it, how many comments do they get on this review?
[892] And, you know, if they got more comments, then that would mean that their work is more scrutinized, but if they got fewer comments, then they were just ignored. And so, you know, there's no way out of it. You know, any result would show that women are being discriminated against somehow. Wow. Man, I'm glad I don't work where you worked. Are you happy to be free of that at all? I miss the free food.
[893] Ah, that's hilarious.
[894] Good food there?
[895] Yeah, I like the food a lot.
[896] I had a friend who was a big executive over there.
[897] A woman, by the way.
[898] Woman.
[899] Oh.
[900] Running shit.
[901] Yeah, she enjoyed it, but she said it was a mess.
[902] Like, she didn't, you know, obviously have the same issues that you had, but she was saying that the whole thing is just chaos.
[903] Oh, really?
[904] Yeah.
[905] Some stupid shit going on over there.
[906] She hated it.
[907] There's definitely.
[908] And I went into this a little bit in the document, too, where, if you have a company that's too progressively run, then it'll be sort of this, you know, everyone's equal and no hierarchy and all chaos and constantly changing, while, you know, the opposite is a really conservative company where there's a lot of hierarchy and decisions are made from the top, where it may not be, you know, very easy to change things.
[909] So, like, I mean, Google is definitely more of the former, where there is a lot of chaos, and there's multiple teams working on the same thing, and this is how we have multiple products that end up doing the same thing and we have to deprecate some. Right. Like, we have, that's very inefficient, like, four different chat apps. Well, Google is in the technology realm, but they're not, they don't have a lot of competition. That's what's really interesting. But then they do in certain ways, right?
[910] Like, they do in the phone way.
[911] Like, they put out the pixel, which I bought, which is kind of a fucked up phone.
[912] Oh, really?
[913] Yeah, it doesn't, like, the microphone doesn't work all the time.
[914] Yeah, like, you have to go to speakerphone and bring it back to microphone.
[915] There's a bunch of bugs, so quite a few issues with it.
[916] And then there's the Android operating system, which a lot of people prefer.
[917] So I think they're pretty competitive in that realm.
[918] But, like, when it comes to, like, search engines, though, they don't really have competition.
[919] Yeah.
[920] That's where it gets real sneaky, because there's a lot of power in that search engine. And then in Gmail, what, well, you know, what competition do they have in Gmail? Yeah. Yeah, I, I started using Yahoo Mail. Yeah? Because I was, you know, really suspicious that Google would eventually read my email. Oh, wow. Do you really worry about that, that they would spy on your email? There were some weird things happening to my phone, so I had, like, a corp attached to it. You had what?
[921] So my corp, so like corp, my work phone basically.
[922] Okay.
[923] And it started like rebooting and after this whole controversy.
[924] Do you have an Android?
[925] Yeah.
[926] And this had never happened before and it hasn't happened since.
[927] And like all these random apps started updating.
[928] It was kind of scary, but.
[929] So do you think they started spying on you?
[930] Is there a way to find out?
[931] I don't know if there's a way to find out.
[932] But, fuck, dude, I would put my phone aside and bring it to, like, the top technologist and, listen, we got to go over this, because that'll be giant, dude, if you found out they were spying on you. Is there anything in your contract that allows them to spy on you? There's some random things where they, yeah, they can basically just spy on you completely. What? Yeah. Like how? So, uh, so all of your keystrokes are sort of logged. And what, at work, or on the work computer? Okay. Yeah, not necessarily your personal laptop or anything.
[933] Okay.
[934] What about your phone?
[935] Yeah, so I don't know exactly what they do, but...
[936] Was it a corporate phone that gave you?
[937] Yeah, it had my Google .com account attached to it.
[938] Okay, but it was it your personal phone?
[939] Yeah, I bought it, but then...
[940] Oh.
[941] So, yeah, they reserve the right to, like, completely nuke it and...
[942] What?
[943] Yeah.
[944] They reserved a right to nuke your personal phone.
[945] Now, this corporate phone, are you allowed to use it for, like, say, if you...
[946] you go on a date or you want to buy a movie ticket or something, you're allowed to use that phone for that? Yeah. So that's a weird marriage of two worlds, isn't it? Yeah, some people would own two phones because of that, but, you know, I'm a cheap person, again, like, I don't want to have Google pay for extra stuff. Right. And, I mean, I can understand why they would want that, because, especially, you know, I was traveling to China for some of my work, and, you know, supposedly if they see that you work for Google, they'll just, like, steal your laptop or your phone, or they won't even explicitly steal it.
[947] They'll, like, go into your room and then install some software on it, and then just put it there, and then the Chinese government will somehow get into Google's networks.
[948] Whoa.
[949] So, I mean, they're rightfully paranoid about some things, but sometimes it's, you know, you don't want to give one entity too much power.
[950] Yeah, my friend who worked for Google was very upset at this whole China thing, because essentially she was saying they have to agree to censorship, the China censorship, and that the only alternative is to let China steal all of what Google's doing and make a fake Google, because that's what they were doing, apparently. Like, they had to make sure that they didn't allow that, and then to do that they had to have certain things, like Tiananmen Square, you couldn't search for that, and it's, like, a lot of weird shit that they would have to censor, any dissent of the government, and, you know, it gets very slippery, right? I mean, like, yeah, you're anti-diverse or you're pro-diversity, but you're also, you're supporting that, like, as a company. That's a giant issue, like, to allow China to censor its citizens. I mean, you're, you're essentially promoting a dictatorship in that regard. Yeah, it's sort of a lose-lose. I, I don't know what exactly they should do. I think they just did it for business.
[951] I think they just made a business choice.
[952] It's a fucking scary choice, too.
[953] Yeah, well, I mean, they were in China, and they supported some of this stuff, but then they eventually chose not to because - So they backed out of it.
[954] Is that recent?
[955] That was, I don't know, before my time at Google, actually.
[956] So they decided to get out?
[957] So they're not involved with China anymore?
[958] Yeah, it's blocked by the firewall.
[959] China, blocked Google?
[960] Yeah, and all of Google services.
[961] Don't they have some weird thing, you can get around that, though? But that's super illegal. If you get around that, you're getting in, like, really big trouble. Yeah, although their official policy is that there is no firewall, so I don't know if they have any laws to actually... Imagine that. A fucking billion people, and they figured out how to do that to them. Yeah. I mean, I think China's not the only case where this is happening. There's other countries where Google also has to censor. Really? Yeah, like in the Middle East, there's some countries that do that.
[962] God, man. So it gets really complicated.
[963] Yeah, I can imagine.
[964] Look, I don't envy them.
[965] I don't, and I don't envy any of the people that work there in management that are sort of responsible for putting out, you know, an infinite number of forest fires all around them all the time, social, economic, you know, dealing with different cultures.
[966] It's not, it doesn't seem like it would be an easy gig.
[967] Yeah.
[968] And, I mean, one of the worries that they have now, too, is, you know, even though they have a large market share for search, they see search as sort of a gateway to the world.
[969] And they don't necessarily have a huge market share for that because, you know, Facebook and Twitter are also ways to get to the world's information.
[970] And a lot of Facebook is just a walled garden where Google can't really get into that.
[971] So some people just, and, you know, on your phone you spend most of your time on Facebook or something, and not necessarily just doing random Google searches, you know. Yeah, I got off of that. I don't really go on Facebook, for that very reason. It seems to me to be the biggest sucker of time that we have. I just, I feel like Twitter, to me, is, like, it's limited by 140 characters, it seems, like, pretty straightforward. I get links, I get interesting stories sent to me. For my needs, that's more, it's more appealing.
[972] And then Instagram is very appealing because I like images.
[973] I like to look at pictures and sometimes people write cool captions and find out about interesting shit.
[974] But Facebook is like, woof, boy, you're going to lose a lot of time on that motherfucker.
[975] Yeah.
[976] And that, so this is a random tangent, but, you know, so I worked on image search and they also see that even though there isn't a huge competitor for image search, there's Instagram and Pinterest, which are very similar things.
[977] And, you know, we do our demographic research and we really look into why people are using these products.
[978] And we see that the majority of the users are women.
[979] And, you know, they actually know why that is.
[980] It's that women prefer, you know, art and aesthetics more than men, on average, right?
[981] And that's exactly what I had in the document.
[982] I mean, we openly acknowledge this when we're looking at the products, because otherwise, you know, you're going to give these random ads to people, and, you know, if you know that they're a man, you're not going to give them ads for women's products, you know?
[983] Right.
[984] So AdSense does discriminate and stereotype people in some ways.
[985] But it's okay.
[986] Yeah, it's, although now they're getting into trying to de-bias machine learning.
[987] So if they do see any things that the machine learning has learned, these statistical anomalies or just trends in the data, then they'll try to remove that.
[988] Why?
[989] That seems like it would be less effective.
[990] It's less effective, but they see it as...
[991] Discriminatory.
[992] Social justice, yeah.
[993] What a mess.
[994] What a mess.
[995] Bing, you need to step up your game.
[996] Come on, Bing.
[997] Bring back that Windows phone.
[998] Come on, Hotmail.
[999] Microsoft had Hotmail, right?
[1000] Nobody uses Hotmail.
[1001] Is that even real anymore?
[1002] Do they have Hotmail?
[1003] I think so.
[1004] So I've been getting a lot of emails from, you know, pretty paranoid people.
[1005] And some of them are from Hotmail.
[1006] Oh, yeah.
[1007] How about AOL?
[1008] I got an AOL.com.
[1009] What the fuck?
[1010] AOL is real?
[1011] Who the hell has AOL?
[1012] Where are you right now?
[1013] Like, where are you, what do you do with your time?
[1014] Yeah, read books, respond to media
[1015] requests. Do you get a lot of them? Yeah, I still get a lot, and, you know, thankfully now some of them are more of the long form, which I like a lot more than just the five-minute TV thing. Yeah, I wanted to give you as much time as I could to just, yeah, talk about this, especially after I heard you on the Ben Shapiro show. I'm like, this guy's getting the shaft. You're a very reasonable person. You're not a misogynist at all, as far as I can tell. You don't seem like a sexist. You don't seem cruel. You're not, like, the type of person, I think, who would go out of their way to promote some sort of a, quote unquote, harmful stereotype, gender stereotype. It just seems so weird. Especially, like, I personally am just very conscious about a lot of these gender stereotypes, and, you know, I use the word they whenever the, uh, whenever the gender of someone is unknown or just unimportant, and, like, I try to avoid using guys, instead just, like, you all or something. Yeah, I say folks. I try to say folks now, because I used to say guys a lot, you know. I try to use the term folks. Yeah, but, yeah, for that very reason. Yeah, and, you know, if I get married, I, I would actually try to, you know, merge our last names somehow. Don't do that, dude. Not, not like the hyphen. Not the hyphen, but just, like, create a new name. Yeah, create a new name. That would be the coolest, if, if you can do it. Well, you know, the former mayor of Los Angeles did that. Really? Yeah, his name is Tony Villar, and his wife had this ethnic name, and so they changed it and put it together, and he became Villarago, Villaragos, Villaragos, I think that's what his name was.
[1016] But it made him seem like he was Mexican.
[1017] And so that's why he went with it.
[1018] And he kept it, he after he got divorced.
[1019] Oh, man. Yeah, it's super... Like, Adam Carolla always shits on him for it.
[1020] I didn't even know about it until he explained it to me. I went, what?
[1021] Like, it's a fake name?
[1022] Because, I mean, I just, I don't, I wouldn't want my
[1023] wife to, you know, just take my last name and lose theirs.
[1024] I refuse to let my wife use her own name.
[1025] You can't.
[1026] Yeah, Villarago, Villaragoza, Villaragoza, Villaragoza.
[1027] Villaragoza.
[1028] You could make a really cool last name.
[1029] Yeah, that's what he did.
[1030] I mean, he was Villar, Antonio Ramon Villar, Jr., and his wife was Raigosa until 2007, and they split up, but he kept that name.
[1031] He kept that fucking ethnic name.
[1032] As long as you don't Google search it, when you do, you go, hey, what?
[1033] What's your fucking dad's name, bro?
[1034] I mean, there are people that have stage names, so it's sort of okay.
[1035] Yeah.
[1036] Nikki Glazer had a very funny joke about that.
[1037] She's a stand -up comedian.
[1038] She had a funny joke about your old name, you know, that, like, when a woman gets married, then, like, all her old name is is when her son gets locked out of his bank account and needs to know, Mom, what was your old name, like, in terms of, like, how to access his account with a password. Yeah. I mean, yeah, it'd be nice if everybody kept their own fucking name. Yeah, but then what do you do with the kids? That's, yes, ask the kid to pick, you know, choose your favorite parent. Yeah, who's your favorite parent? Have a meritocracy inside your own family. Now, you can't do that, right? Yeah, and then if you have it so, oh, the first child is this, second child is that, then it just gets too confusing.
[1039] But what if you change your name and then you break up?
[1040] Do you go back to your old name?
[1041] It depends on how cool it is.
[1042] If it's Villaragosa, then you keep it.
[1043] If it's good, you keep it, it ingratiates you with the ethnic markets?
[1044] Maybe.
[1045] Maybe.
[1046] Yeah.
[1047] Tricky.
[1048] I don't know.
[1049] Marriage in the self is very weird.
[1050] It's some sort of strange legal contract with the state that involves relationships, which is just so bizarre, which is why 50 % of them fall apart, you know?
[1051] Yeah.
[1052] And that's, Chris Rock had a great joke about that.
[1053] That's the cowards that stay.
[1054] Like how many, how many of the people stay?
[1055] 50 % left.
[1056] Like, how many people are fucking miserable and they're still involved in that contract?
[1057] I mean, it's 50 % that fail.
[1058] It's a good argument, you know?
[1059] Yeah.
[1060] Is it the 50 % of the initial ones get breaking up or just 50 % of all marriages?
[1061] So, like, there are some that get married 10 times.
[1062] So do they get counted in that 50 %?
[1063] Yes, because initial marriages.
[1064] Like, if you get a union, I do, I do.
[1065] How many of those work?
[1066] 50 % stay unionized.
[1067] Oh, man. Of the first one, that's pretty bad.
[1068] It's not good.
[1069] Yeah, it's not good.
[1070] I'm happily married and I tell people don't do it.
[1071] It's not worth it.
[1072] It's a fuck of ridiculous proposition.
[1073] And if you're, whether you're male or a female that makes a lot of money and the spouse doesn't, then you run into this very weird situation, you know.
[1074] Yeah, it's scary.
[1075] Mm -hmm.
[1076] And, you know, potential of losing custody of kids.
[1077] Yes.
[1078] Yeah, it gets real weird.
[1079] But it makes sense with children because, you know, like, look, creating life is way more of a commitment than divorce and marriage.
[1080] Because you could easily get divorced.
[1081] People do it every day.
[1082] But creating life is like, that's a significant responsibility.
[1083] I mean, it's gigantic.
[1084] You could get along with someone else.
[1085] I mean, you could get divorced and go through all the turmoil and all the stress and then find a new person and maybe they'll be, maybe it'll be better.
[1086] Maybe you marry that person and it'll work out well.
[1087] Maybe you learn from your first relationship.
[1088] But I think the commitment of raising a human being is way more of like a serious long -term responsibility.
[1089] So if you could do that, like you can stay married, you know, work it out.
[1090] As long as the person's reasonable, get a reasonable person.
[1091] Do you know anyone that's getting an arranged marriage or has?
[1092] Arranged marriage?
[1093] Yeah.
[1094] No, I don't.
[1095] Because those actually, I think they stay together more than that 50%.
[1096] Really?
[1097] Yeah.
[1098] So like rich parents get together with another rich family and they bring over their daughter and that kind of shit?
[1099] Or not necessarily just rich.
[1100] I think it happens a lot in more traditional countries.
[1101] Like India, it still happens.
[1102] What is that?
[1103] The divorce surge is over, but the myth lives on?
[1104] That's a chick who wrote that.
[1105] It's just fucking propaganda.
[1106] That's fake news.
[1107] Get that shit off the screen.
[1108] What does it say?
[1109] I just saw another article in Psychology Today that said it's down to about 25%, or 75% survive.
[1110] What?
[1111] Yeah, one in four end in divorce.
[1112] But if you get married a second or third time, the rates go way up.
[1113] Yeah, that makes sense.
[1114] It says it's, like, a myth, quote unquote, myth from the 70s and 80s, but there also isn't the amount of time, if you got married in the last 10 years, to say you're going to get divorced in 20 more years.
[1115] I would like to know the actual hard data with the United States of America because culturally it gets weird when you look across the different countries, but what about the United States of America?
[1116] What are the percentage of people who get married who wind up getting divorced?
[1117] Let's find that out.
[1118] What do you say it is?
[1119] Oh, across the world?
[1120] No, no, no, just the United States.
[1121] Oh, just the United States.
[1122] I would think I'm sort of trusting that random.
[1123] You think it's about 25 % get divorced?
[1124] Divorce rate in the U.S. drops to nearly 40-year low.
[1125] Wow.
[1126] Look at this.
[1127] It represents a jump from 31.9 in 2014 and is the highest number.
[1128] Okay, 32.2.
[1129] Okay.
[1130] Marriage rates, on the other hand, have increased.
[1131] There's 32.2 marriages for every 1,000 unmarried women.
[1132] But what is the divorce rate?
[1133] 16.9 for 1,000 married women 15 or older.
[1134] What is a fucking percentage, you fuck?
[1135] 23.
[1136] They're throwing around too many.
[1137] 50 % chance, okay, typical marriages still have a 50 % chance of lasting.
[1138] That's all I said.
[1139] It's the same goddamn number.
[1140] Researchers have found that typical marriages still have about a 50 % chance of lasting.
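For what it's worth, the two figures being thrown around here measure different things and are not necessarily in conflict: 16.9 per 1,000 is an annual rate (divorces per 1,000 married women per year), while "about 50%" is a cumulative lifetime estimate. A rough back-of-the-envelope sketch, assuming purely for illustration that the annual rate stays constant over a 40-year marriage horizon (a simplification, not how demographers actually compute it):

```python
# Rough reconciliation of an annual divorce rate with a lifetime percentage.
# Assumes a constant annual rate over a fixed horizon, purely for illustration.

annual_rate = 16.9 / 1000   # divorces per married woman per year (from the article)
years = 40                  # assumed marriage horizon

prob_still_married = (1 - annual_rate) ** years
prob_ever_divorced = 1 - prob_still_married

print(f"Cumulative divorce probability over {years} years: {prob_ever_divorced:.0%}")
# Prints roughly 49%, the same ballpark as the "about 50%" figure.
```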
[1141] That's very fucking...
[1142] I think this is still talking...
[1143] Including the second and third marriages and beyond.
[1144] Well, marriage is marriage.
[1145] I mean, if you have a second marriage, it means you failed.
[1146] It means you got divorced.
[1147] Yeah, yeah, but if...
[1148] Go back to that, Jamie, please.
[1149] It says, researchers have found that typical marriages still have a 50 % chance of lasting.
[1150] That means you have a 50 % chance of not lasting.
[1151] But that's just assuming you look at every single marriage.
[1152] But if you look at the first marriage, then maybe you have a 70 % chance of never getting a divorce.
[1153] Huh.
[1154] And then, but if you do get divorced.
[1155] So you, like, factor it in when people are in, like, the second,
[1156] third, and fourth marriage, like those Elizabeth Taylor-type folks. Yeah, have, like, nine or ten marriages. And so it might have reduced because people are just getting married later, so they're choosing, rather than just, oh, I got pregnant when I was young, or I didn't have anything else to do, so I got married. Look at that, Hawaii had the lowest. It's because it's fucking awesome there. Maybe if you live somewhere awesome... They have universal basic income. Yeah, great weather, great marriages. What did you put up, Jamie?
[1157] What have you highlighted?
[1158] Just other things that they say factor in.
[1159] Like, cohabitating has become less stigmatized.
[1160] So, like, living together but not getting married.
[1161] It's another thing that's happening more.
[1162] Okay, people don't look to marriage to shore up an unstable relationship.
[1163] Marriage rates have been declining for years.
[1164] So less people get married, but the percentage is still.
[1165] You also don't have to get married when you have a kid right now.
[1166] That's like you're not rushing to do it.
[1167] Right, right, right.
[1168] It's less stigmatized.
[1169] Yeah, I mean, imagine if that was a friendship thing.
[1170] Okay, man, we're best friends or what? Let's fucking go to court, dude, do this. I mean, it's just, it's weird. I mean, it really is. But it does, it makes sense for some people. They like it. People like rituals, you know. It feels good to, you know, like, say it and do it and make it real, and jump over the broom like they did in Roots. Yeah. I, I think one interesting thing that I was looking into a little bit was, um, the rates of divorce for homosexual marriages, because that's also sort of interesting.
[1171] So one thing about heterosexual marriages is women initiate the divorce in 70 % of cases.
[1172] Of course they do.
[1173] Which is, you know, I wouldn't have necessarily predicted that.
[1174] Really?
[1175] So.
[1176] Don't you think women get pissed more?
[1177] Like, fuck you.
[1178] Don't you?
[1179] Yeah, I guess.
[1180] Relationships that you've been in, have women been pissed off at you more than you've been pissed off at them or the opposite?
[1181] Definitely they get mad at me. Sexist.
[1182] You're a goddamn.
[1183] promoting harmful gender stereotypes.
[1184] I generally just don't get that angry, so I think that's part of it.
[1185] That's probably why they get mad at you.
[1186] You don't even fucking care.
[1187] And so like not to push on the one thing that sort of hurt me, but the neuroticism trait has been linked to unstable marriages.
[1188] And women having more of that has been part of the explanation for why.
[1189] And so you see in lesbian couples, they also have a higher
[1190] rate of divorce than gay couples.
[1191] The gay men?
[1192] That's interesting.
[1193] So, like, part of it is women want to settle down much faster, so they'll move in within like a month.
[1194] And then, which, you know, obviously is too soon to know whether or not that's a long -term relationship.
[1195] Well, especially if neither one of you are flexible and you don't sort of adapt to each other's needs and desires.
[1196] So I interrupted you, though, what is the percentage of gay men?
[1197] How often do they get divorced?
[1198] I don't know the exact numbers, but it's lower than the lesbian one.
[1199] And because, you know, we only have the last few years or so.
[1200] Of gay marriage.
[1201] Yeah, so we don't know the actual long -term rates.
[1202] But it's very interesting because, you know, it's a world that I don't know much about.
[1203] Right.
[1204] Has, um, how do you come through all this?
[1205] Like, do you feel damaged by this at all?
[1206] Do you feel like your name has been besmirched?
[1207] Definitely, you know, I just went to a party with my friends.
[1208] And, you know, some of them I was much closer to and had already talked to about this.
[1209] Some I hadn't.
[1210] And, you know, you never know how they felt about it.
[1211] Right.
[1212] So some, it seemed like they, you know, I could tell that they were like, oh, yeah, man, hey, what's up?
[1213] But some.
[1214] Finally broke.
[1215] One of us, bro.
[1216] Keep it tight.
[1217] Secret handshake.
[1218] But others were maybe a little averse to me. So that may happen in the future.
[1219] Prejudices that they have going into the conversation.
[1220] And oh, I didn't know that you were a sexist.
[1221] Wow.
[1222] Have you gotten that?
[1223] Have you gotten people outright insulting you?
[1224] There was one person that was just on the road.
[1225] Just F you.
[1226] Really?
[1227] Yeah.
[1228] On the road?
[1229] Where was this?
[1230] In Mountain View, so...
[1231] You were driving your car?
[1232] I just got out of my car, and they just yelled at me. Male or female?
[1233] It was a guy.
[1234] And what did he say?
[1235] Just F you.
[1236] Just fuck you for what?
[1237] For being that.
[1238] How do you know that that's why he said that?
[1239] Well, yeah, I mean, I've never really been yelled at except by, like, crazy people on the streets.
[1240] What this guy looked like?
[1241] Just normal nerd guy.
[1242] Did he have his girlfriend with him?
[1243] Yeah.
[1244] He was trying to impress her?
[1245] Maybe.
[1246] Oh, that cunt.
[1247] That piece of shit.
[1248] I want to smack him.
[1249] God.
[1250] But most of the support, or at least, like, personal interactions, have been in support of me. So there have been random people like, oh, are you James Damore?
[1251] It's like, yeah.
[1252] Wow.
[1253] So one guy with a girl yelled out, fuck you.
[1254] And he looked like a nerd.
[1255] She'll dump him.
[1256] Don't worry about it, buddy.
[1257] It'll all come around.
[1258] I had a friend that was like that.
[1259] He was super fucking male feminist.
[1260] And then eventually his spouse went nutty on him and crazy.
[1261] And then all of our friends went nutty on him.
[1262] And now he's like, He-Man Woman Haters Club.
[1263] He went the other way.
[1264] Oh, really?
[1265] Not totally.
[1266] But he's like, what the fuck, man?
[1267] I'm like, yeah, you can't just rely on a gender to be cool.
[1268] You have to rely on individual human beings and their personalities and their actions.
[1269] and their character.
[1270] I can't believe we have to actually go over this.
[1271] But no, like, all in all, men aren't great.
[1272] All in all women aren't great.
[1273] You find unique people that are cool in all sorts of groups.
[1274] Yeah, and, you know, once you start aligning yourself with one of these groups, and, you know, if you ever go against any of their principles and, you know, they're constantly changing and getting more extreme, then you'll eventually get ostracized.
[1275] And maybe that's what happened.
[1276] That that's a big issue with the left. Sure. You know, the left eats itself. But I don't think that's as much of an issue with the conservative right, you know, with like rational conservatives, not like racists and full right-wing nuts. But, you know, I think what people just want, they want harmony, I think, overall. Right. You know, they want to succeed and they want harmony, which sometimes are mutually exclusive. Yeah. And, you know, a lot of people just don't acknowledge that, you know, most people are normal and they just want to live their life, and, you know, even though they might have voted for Trump or something, they're not some evil person. Right. Yeah, they're not the KKK. Which, I've met a lot of people in Silicon Valley that basically equate voting for Trump and being in the KKK. Yeah. Yeah, that's harmful. That's really splitting groups. And, you know, if you're going to build products that are for the entire world, then you really need to understand other people.
[1277] Especially, you know, a lot of the world is actually more conservative than, I mean, Europe may be more liberal than the U .S. in some ways, but a lot of Asia and Africa and South America is more conservative than we are.
[1278] So we need to at least understand what's happening and what their worldview is.
[1279] Yeah, the idea that everyone who voted for Trump is in the KKK is so crazy.
[1280] But it's convenient to demonize the other.
[1281] Yeah.
[1282] We would love to do that.
[1283] We love to look at groups and just block ourselves off and this is us and we're on the right.
[1284] And these people on the other side, they're incorrect.
[1285] And it's a real normal, common tendency that human beings have that we should be very, very aware of, but we're not.
[1286] You know, we have these convenient blinders that we put on.
[1287] whenever we're engaging in any sort of ideological discussions where our belief systems might be challenged.
[1288] We dig our heels in and, like, that's it.
[1289] I think you see a lot of that with the left with this whole like you cannot be progressive enough.
[1290] You know, like, it's like they're getting wackier and wackier with it.
[1291] It's just really weird.
[1292] Yeah, no concept of, you know, okay, I'm an ally with you on this thing, even though we may disagree on this other subject.
[1293] And that's just completely impossible in their head.
[1294] Yeah.
[1295] But I don't think there's enough real discussions going on in this world, too.
[1296] I think people are a lot of times following these predetermined patterns of behavior.
[1297] That they think they're expected to follow as a progressive or as a conservative.
[1298] And then they just go with it.
[1299] And then when they do engage with someone who has a differing opinion, then it becomes a, in quotes, game again.
[1300] It's trying to win rather than trying to understand, like, what this person sees and what they think. What is your philosophy, how are you approaching this, and trying to be, like, really open-minded about it. Yeah, like, I see this even in myself when I'm talking to someone and, you know, maybe they're a feminist or, like, extreme in some way, and, like, I'll discuss them and I'll immediately just stereotype them as someone that's even more extreme, and I'll read into their words of, oh, you said that, that means that you mean this. Even though, you know, maybe it's important to at least show what the extreme outcome would be, and therefore we can't just take this on principle. But, you know, everyone does it, and, yeah, it's really hard to not do it. It is hard to not do it. It's one of the reasons why I think, like, long-form conversations are so important. How often do you ever sit down like this with someone and talk for a couple hours, with just you and the person talking, not looking at your phone, not checking the TV? No one. We very rarely do this. I think this is one of the only ways we could really work out ideas, especially when you're talking to someone that might have a differing opinion, but they also might be intelligent, and you might be able to, like, sort it out. Like, let me parse out what your thoughts are and see where I differ and how you got to where you got, and maybe I'll have a better understanding of your philosophy. But there's a lot of people that don't even have a philosophy. It just sounds good, so they just go with this predetermined pattern that's easy to follow. You know, as a left-wing progressive, I feel this. I mean, I've heard people say that before, like, as a Democrat, I've always felt like, oh, we as a Democrat. How about as a fucking person? Yeah. You know, ideas are hard. You know, thoughts on life and how we cohabitate and how we move through this fucking existence together, it's very difficult to work out. There's just so many variables, so many styles of human, you know, there's just so many different things that we have to work through together. And to try to do that based on patterns that other people have established and that you cannot break. I mean, that's one of the reasons why it's so ruthless to say that all white people have some implicit biases that they may not even be aware of, and this, uh, you know, unintended racism flavors all conversations. Like, you're just poisoning this conversation. You're poisoning this conversation with this fucking fishing line. It's all tangled up. Now we're going to have to figure out what is real and pull this apart and get it back on the spool, you know. Yeah, and there's no solution for some of those, too, where, you know, you just say there's some boogeyman-type thing that's, yeah, controlling all this, and there's some conspiracy that we can't really see and we can't point out specific examples, but it's ever-present.
[1301] And I think, yeah, a lot of the treating people as individuals, that has become more of a libertarian thing.
[1302] Yeah.
[1303] And so it's hard, at least for me, to understand some of this more collective thinking and social conformity, which I've never been a fan of.
[1304] Do you think it was a good idea to write that memo?
[1305] Like if you had to go back again, if you were in front of your computer and you're ready to press send, would you?
[1306] Maybe I would wait for my year -end bonus.
[1307] I mean, I think I would have pushed harder, even harder on the diversity programs, although like I met with them personally and I kept pinging them and I sent so many emails to them just trying to have a discussion about this.
[1308] And I went through multiple other programs and sent this document, this exact document, to them.
[1309] So it's really unclear of what I could have done differently.
[1310] But, I mean, I, for example, I didn't know so much about the underground conservative network before all of this.
[1311] At Google, you mean?
[1312] Yeah, and even within Silicon Valley.
[1313] Like, there's attempts to sort of connect them between companies, but there's, like, so much verification that you need to go through to be able to join one of these.
[1314] Do you have to have a pseudonym?
[1315] You don't need to be totally anonymous, but you know, you don't want, because there's active attempts to try to infiltrate these groups.
[1316] Really?
[1317] Yeah.
[1318] This happens a lot where they'll try to join a group, act as if they're one of them, and then just, you know, record what's happening and then expose them.
[1319] Whoa.
[1320] And, you know, you can take anything out of context, and it would be shown as, you know, racism or something.
[1321] Oh, for sure.
[1322] Well, I mean, think of what we've said, well, not you, but me, joking around in this conversation, you could clearly take something I've said out of context and make it look like I'm a monster.
[1323] But if you're in an email and you're complaining about some sort of diversity program.
[1324] Yeah, like what they often do is they will find someone that they disagree with, and then they'll scour through their entire history at Google and all the emails that they've sent and try to look for some way to blacklist them or show that this person is evil, therefore they should be fired.
[1325] It's horrible.
[1326] Supposedly, this is happening in other companies too, and they even have these automated scripts to try to find these negative things on people that they don't like.
[1327] Wow.
[1328] So a little psychological covert warfare.
[1329] Yeah.
[1330] And that's also going to contribute to people toeing the line, right?
[1331] Yeah.
[1332] They want to keep their job. Like, look at you, you just said maybe you would have waited until you got your year-end bonus. I mean, and you are a single guy, right? Yeah, so that definitely helped, where I don't have as many responsibilities. Imagine if you did. You know, you probably wouldn't have said anything. You would have thought about it and gone, you know what, I have to worry about my family and taking care of my bills. Oh, so weird, man. Yeah, I mean, the worst part is these people think that they're doing the right thing, right? Like, censoring people and finding these people is the right thing, because those people are wrong. Yeah. So that's how they think. They think, okay, everyone sees the world the same, therefore anyone that just disagrees with me is either misinformed or a misogynist bigot. Right, otherwise how could I have possibly said those things? Yeah. When really, you know, people with different political ideologies see the world differently and they have different biases, and, you know, none of them are totally correct, but we need to be able to discuss things to show, you know, a more objective view of the world. Without a doubt. The fact that that is even up for debate is very strange. I mean, that's an ideological echo chamber, and for whatever reason, that seems like where tech is. That's where technology companies seem to lean, towards this very left-wing ideological echo chamber. Yeah, and, I mean, I saw it a lot, too, in the comments on the document, where I said, oh, yeah, these are just biases, and they're like, no, the right is indoctrinated, they're just KKK, and they're anti-education, they're anti-poor-people, they're anti-everything. No. Not all of them.
[1333] And, you know, at least the way I see it, and not being, you know, total conservative, I can't necessarily say, but it seems like they don't necessarily hate poor people or anything.
[1334] They just think that these certain incentive structures are what's best for society.
[1335] And, you know, it's not best to promote or they think that some things will lead to laziness or something.
[1336] Right.
[1337] And that's not saying, oh, yeah, these people are just.
[1338] horrible people.
[1339] They actually want to help everyone and they think that these social norms and government programs may be hurting people.
[1340] Yeah, I mean, there certainly are some people that are right -wing that think like that, and then there's some people that are right -wing that are really racist.
[1341] Yeah.
[1342] They exist too.
[1343] And there's some people that are left -wing that are really racist, and they're really racist towards white people.
[1344] I mean, that's, there's white people that are racist towards white people.
[1345] I mean, I've read so many fucking tweets from people that... you know, like, I follow a bunch of anti-social justice warrior accounts, and they'll find people that tweet, like, really horrible shit about white people, who are themselves white.
[1346] Yeah.
[1347] That's like, I get what you're doing.
[1348] Just try to get those brownie points.
[1349] Trying super hard to get, you know, people of color to love you.
[1350] And, you know, as an ally, it's just a very strange time.
[1351] I think a lot of it has to do with this new found ability to communicate that just really did not exist in the past.
[1352] If you wanted to get controversial ideas passed, you know, to massive groups of people in the past, you had to write a book or you had to get an article published.
[1353] You had to have some sort of a major media distribution center take your work and put it out there for the people.
[1354] Right.
[1355] Now that's not the case.
[1356] So now you get a lot of like really fragile or really poorly thought out ideas.
[1357] And as long as you can hit the nerve of enough retards, you can get those fucking things out.
[1358] And then they start promoting.
[1359] I mean, that's where the flat earth movement is coming from.
[1360] I mean, what is that other than that?
[1361] I mean, that's exactly what it is.
[1362] It's enough people that just don't have a sense of the importance of critical thinking skills or are not used to objectively assessing ideas.
[1363] And then they coalesce in these groups that are like -minded.
[1364] And you can get that with racism.
[1365] You can get that with sexism.
[1366] You can get that with pretty much anything.
[1367] Right.
[1368] You get these like -minded groups, they get together, and they have confirmation bias, and they get an ideological echo chamber, and they start reinforcing each other, you know.
[1369] Yeah.
[1370] And definitely, I mean, it affects who you follow, and then you just assume, oh, yeah, everyone thinks like this, therefore it must be right.
[1371] Yeah.
[1372] And, I mean, it's really a shame, though, that this is happening even in, you know, the pursuit of knowledge in academia, where so many people have a certain worldview, like the social sciences, where 90% of people lean left.
[1373] And that can create its own confirmation biases.
[1374] And especially when, you know, it's definitely bad, like in tech, where 20% of people are women and they can feel alienated.
[1375] But at least, you know, overt signs of sexism are seen as bad.
[1376] Right.
[1377] But overt signs of discriminating against people based on their political orientation is seen as okay.
[1378] And people do it. And so there's a big asymmetry there, where you actually feel it's justified to... Maybe it wouldn't be as big of an issue if we had a reasonable Republican president. Maybe if we had someone who was, like, really kind and rational, like maybe a Mitt Romney type, who seemed like far more reasonable. And, you know, it's the sort of, you know, I mean, we have a bunch of issues, obviously, as a country now with this guy as president.
[1379] And I think that we're also dealing with really an infant stage of information distribution, like the ability for anyone to mass distribute anything is so, like, anyone can create a, you know, a YouTube video.
[1380] And if it strikes a chord, it can hit a million people like that.
[1381] There's never been a time like that before, you know, I mean.
[1382] And the incentive structures are all out of whack where it's better to be outrageous than it is to be honest.
[1383] And that's causing a lot of our headlines to just be, you know, oh, he's just a sexist, bigot.
[1384] And, you know, there's no room for nuance.
[1385] But also, don't you think that there's a lack of time that people have to, like, just, like, I told you how much time I spent going over your stuff.
[1386] And after a while, I was like, what the fuck am I doing?
[1387] I don't even work in tech.
[1388] But most people don't have that kind of time, nor do they have that sort of obsessive mindset.
[1389] They look at the surface of something.
[1390] Oh, this guy wrote a sexist memo about women in tech.
[1391] Fuck him.
[1392] He's probably a misogynist.
[1393] And they just march towards their meeting.
[1394] And we have to avoid the kind of thinking that led to someone thinking that it's okay to write the Google memo.
[1395] And then everyone like, yes, here, here.
[1396] I want my year -end bonus.
[1397] I'm with you.
[1398] I think as the dust settles, we will get more and more truth out of people.
[1399] And I think there's a general trend with information to have information be easier and easier to distribute.
[1400] That's one of the most important things about technology, right?
[1401] The instantaneous access to information.
[1402] And right now that information is not entirely verifiable.
[1403] Like, some of it is and some of it's not.
[1404] And that's one of the more disturbing things about people reprinting your memo without citations.
[1405] I was like, hey, like, you fuckers, you left out a big part of what this is.
[1406] Like, what you did is really wrong.
[1407] Those citations, maybe people won't go into them.
[1408] Maybe they won't read the studies.
[1409] Maybe they won't.
[1410] I mean, it takes a long time if you really want to get involved in that.
[1411] But there will be a better version of that in the future.
[1412] I think there will.
[1413] I think that's where the, I mean, obviously I don't know, but I think that's where the trend is.
[1414] I think the trend is leaning towards more and more honest interpretation of facts and ideas.
[1415] And then, you know, we'll be left with some things that we have to look at that we can't just write off to sociology or write off to culture or write off to biases or sexism or racism.
[1416] We're going to have to look at things for what they really are.
[1417] And maybe we'll have a better understanding of why we behave the way we do, why
[1418] we have the problems that we have.
[1419] Yeah, and part of the issue, though, is if someone controls access to information and they want a certain narrative to be told, then, you know, it'll really color what people see, and that's what's scary.
[1420] And, you know, we see this a lot on YouTube now, where they're demonetizing anyone that they see as right-wing and even censoring and removing videos.
[1421] It's really scary.
[1422] It is.
[1423] It's fascinating.
[1424] I mean, it's, it's quite fascinating to watch it all play out and to have them do it like right in front of everybody's face.
[1425] And everybody goes, what are you doing?
[1426] Like, you're changing narratives.
[1427] You're altering information.
[1428] And they feel like they are right.
[1429] Right.
[1430] They're doing the right thing.
[1431] They're promoting diversity.
[1432] They're promoting liberal values and progressive ideas.
[1433] and they think they're doing the right thing.
[1434] I don't necessarily think they're right, though.
[1435] Yeah.
[1436] There's a lot of blowback, though.
[1437] I mean, this is not a free ride for Google right now with what they've done to you.
[1438] I mean, I'm sure there's been, I mean, I'm sure they're doubling down because they don't want to admit they fucked up.
[1439] If they admit they fucked up, everybody across the board loses that year-end bonus.
[1440] It becomes a real issue, right?
[1441] Everybody gets fired.
[1442] But if they, you know, if you look at it long term over the long run, I mean, they have definitely taken a hit.
[1443] And if someone forces them to sit down, I would love to sit down with the guy who said that you promote harmful gender stereotypes and go, let's go over this thing.
[1444] Let's go over this thing step by step.
[1445] You tell me what's wrong.
[1446] Yeah.
[1447] And just pick them apart.
[1448] That's what I've always wanted.
[1449] He'll fall apart, 100%.
[1450] He'll just say a bunch of stupid social justice warrior bullshit.
[1451] And if you just keep him in a room for three hours with a microphone, he's going to look like a fucking idiot.
[1452] There's just no way around it.
[1453] You know, there's no way around it if you're actually going off of what you wrote. Somehow or another, like, I think it's not just dangerous to say it promotes harmful gender stereotypes, it's disingenuous. And the reason why it's dangerous is because I could just read what that guy said and I would think that you're a creep, and that's dangerous to you. Yeah, it's dangerous towards the marketplace of free ideas. The marketplace of ideas, it's extremely important.
[1454] And I would think that if anybody would know that, it would be the people that are involved in tech.
[1455] You would think so.
[1456] Yeah.
[1457] I mean, they're just so wrapped up.
[1458] Just so wrapped up in the progressive mindset.
[1459] It's weird, man. Yeah, I mean, it's so related to all this microaggression, you know, speech is violence and all ideas are harmful.
[1460] And, of course, you know, some ideas are harmful.
[1461] But it's only through openly discussing them that you can actually dispel some of these things.
[1462] By making them, you know, forbidden knowledge, that's only going to attract certain people. And, you know, we even see this now where some of the YouTube videos are in this purgatory-type state where, you know, you can't really get to them, but if you know the URL you can still find them. Yeah, yeah, people are getting aggregated lists of those and actually viewing them. Yeah, and, oh, this is what YouTube doesn't want us to see, maybe there's some truth to it, why don't they want us to see it? Yeah. If you win a certain amount of money, are you willing to buy a gold-plated Ferrari and drive around with a fur coat? Because I think they'll do shit, you got like some big-ass crazy sunglasses. Uh, yeah, I don't think you can win. I don't really know. And, I mean, I'm trying to, I mean, what I would ideally want is somehow changing their policies, but I don't really know how I as an individual can, you know, compel Google to do something like that. But I think at least some of the stuff, like the blacklisting, where they have these people that, you know, compile these spreadsheets of names of people that are conservative, right, or even libertarian, and, oh, we're not going to work with them, we're going to sabotage their work, and we're going to try to get them fired, and when they are looking for another job, we're going to share this list so they can't get hired at any of the other major companies.
[1463] Like, that needs to.
[1464] So that's real.
[1465] Yeah, that's real.
[1466] How do you know that that's real?
[1467] Have you seen this list?
[1468] Yeah.
[1469] So there have been multiple people that have admitted to having a blacklist.
[1470] Wow.
[1471] Libertarian, not even conservative, not even right wing, but smaller government, libertarian.
[1472] Yeah, just because, I mean, it's generally free -thinking people.
[1473] Yeah.
[1474] not toeing the party line.
[1475] And those people get blacklisted.
[1476] Really?
[1477] Like there's an actual list somewhere?
[1478] Have you seen an actual list?
[1479] I haven't seen an individual list.
[1480] I think there's multiple lists spread out.
[1481] But people, even like high up managers, have admitted to having a blacklist.
[1482] Wow.
[1483] And, you know, we've brought this up to the highest people at Google, and they just completely dismissed it.
[1484] We're not going to deal with it. So do you feel like they feel that they have some sort of a social responsibility to push progressive values, because they're in this massive position of influence and they feel like that's the right way to think, so they're going to go full steam ahead? Yeah. And don't be evil, don't be conservative, pretty much. Yeah. Hmm. But libertarian, man. Boy, it's a fucking tough sell to say that Gary Johnson's evil, you know. I don't know. Well, yeah, it's really hard to understand that mindset.
[1485] Yeah.
[1486] Well, I get it, though, because I think it's a lot of the same things along the same lines that you were talking about when you were saying that you didn't, you know, like maybe you would have waited until you got your year -end bonus.
[1487] And you're a guy who's also frugal.
[1488] You've saved your money and you don't have a family to support and you're okay.
[1489] You know, you got fired and you're still.
[1490] Okay.
[1491] Whereas some people would be fucked right now.
[1492] Maybe they'd be overextended.
[1493] Maybe they had that gold Ferrari and the fur coat and, like, shit.
[1494] Yeah.
[1495] I mean, if I had a mortgage or something, that would be really scary.
[1496] That's where it gets scary.
[1497] That's a lot of people's decision making.
[1498] I mean, that's, that goes back to, you know, engineering civilization in the early days of Rome.
[1499] I think there were writings about that, about getting people to commit to families, and it's easier to control them when they have loved ones and, you know, things
[1500] that they enjoy, and positions of power and status, that it's easier to get those people to give in to your needs and desires.
[1501] Yeah, I mean, it makes sense, right?
[1502] I mean, it's just engineering a civilization.
[1503] It's one of the, like, getting people to perform and behave the way that you would like them to is a critical component of engineering any sort of a civilization.
[1504] And Google's essentially a civilization.
[1505] If you look at it that way, I mean, internally, there's a community.
[1506] It's a structure.
[1507] And they're engineering that structure to be very much a like -minded ideological echo chamber.
[1508] Yeah, and I think it's really going to bite them in the back at some point.
[1509] Like they're making the easy decision of not really facing the, you know, the truth as I see it.
[1510] And, you know, if you turn your back on that for too long, it's really going to have negative consequences later. Yeah. Well, I feel like one thing is super important to point out, and I think we kind of already did, but women do experience a lot of sexism. And again, it's because, like I said, men are gross. You know, there's a lot of gross men, and men working in close proximity with women. I mean, men working with other men, they're going to find things they don't like about those men. You know, I mean, people in interpersonal relationships are fucking gross and messy. And if men work with women and they feel like they can dominate them with aggression or with some sort of weird tactics that play on the agreeableness that females seem to have.
[1511] You know, it's a problem.
[1512] And I think by not looking at that, by not being honest about that, we do just as much of a disservice.
[1513] Yeah, although, I mean, I would say that, you know, there are men that are just as agreeable and just as much of a pushover, say.
[1514] Yeah, sure.
[1515] And, you know, they also get shunned and pushed aside.
[1516] And sometimes it's even worse for men that fit that stereotype or don't fit the typical male stereotype because, you know, there's negative consequences on both sides for not being masculine if you're a man or not being feminine enough if you're a woman.
[1517] Yeah, like you're not allowed to just be yourself, right?
[1518] You have to, like, fit... you're better off if you fit into some sort of a classic narrative.
[1519] Yeah. So where do you go from here, besides suing the fuck out of Google? Google, just give him some money, just shut him up. Do you want to go through a lawsuit? Like, what if they came to you with a settlement, would you just take it and shut your mouth? I really want somehow for them to address it, but I don't know how to do that. Well, even if they, if you lose in court, will they address it? They'll probably say, you know, although we support the court, we disagree with the ruling, and we still support gender equality and blah, blah, blah, blah, blah. Yeah. I mean, I think part of it is that there's currently an asymmetry, so maybe Google is acting in their best interest to act the way that they are, because they think that, you know, there's all these activists that are trying to attack Google if they don't fit this certain party line. Are there a lot of activists that are attacking Google in that regard? Yeah. And, you know, we even see that there's now a potential class action lawsuit against Google for gender pay disparity, and so, like, they just are looking for anything. And if we say that, you know, if there's only incentive coming from one side, then they're only going to push farther and farther to that side. And this gender pay disparity, is this involving similar jobs? Yeah, so they claim that it's the same job, although at least when Google was doing their own internal analysis, which they've been doing for years, they show that there's no disparity once you control for performance.
[1520] And so it's really unclear.
[1521] But when you control for performance, performance tends to favor males?
[1522] Maybe.
[1523] I mean, if that's what they're showing, that there is some sort of gender disparity if you just look at the aggregate.
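To make that "control for performance" distinction concrete, here is a minimal sketch in Python with entirely made-up numbers; nothing below reflects Google's actual data or methodology, it only illustrates how an aggregate gap can shrink or vanish once you compare people within the same rating bucket.

from statistics import mean

# Hypothetical records: (gender, performance_rating, salary). All invented.
employees = [
    ("F", "meets",   100), ("F", "meets",   101), ("F", "exceeds", 130),
    ("M", "meets",   100), ("M", "exceeds", 131), ("M", "exceeds", 129),
]

def avg_salary(rows):
    return mean(salary for _, _, salary in rows)

women = [r for r in employees if r[0] == "F"]
men = [r for r in employees if r[0] == "M"]

# Raw (aggregate) comparison: ignores performance entirely.
print("raw gap:", avg_salary(women) - avg_salary(men))

# Controlled comparison: only compare people with the same performance rating.
for rating in ("meets", "exceeds"):
    w = [r for r in women if r[1] == rating]
    m = [r for r in men if r[1] == rating]
    if w and m:
        print(rating, "gap:", avg_salary(w) - avg_salary(m))

With these invented numbers the raw averages differ by roughly ten units, but within each rating the difference is essentially zero, because the two groups are distributed differently across ratings. The same structure applies to any confounder: hours worked, role, tenure, and so on.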
[1524] Look at this.
[1525] One in 100 million chance alleged gender pay gap at Google is random.
[1526] says class action lawyer. Oh, Jesus. Class action lawyer says that. And the article's written by a... check, fake news, fake news. You're not going to get me, you fucks. I mean, one thing is, I don't think that they really have Google's internal data, so there's no way for them to say whether or not it's based on performance. Look what they're saying here, you know: notices seeking women currently or formerly employed at Google for possible inclusion in a planned class action lawsuit.
[1527] First of all, people hear that and they're like, we're going to get paid.
[1528] We're going to Sizzler, right?
[1529] I mean, that's just, you're playing on human instincts when you seek out people that may have been employed for a possible inclusion in a class action lawsuit.
[1530] That's not saying that they weren't wronged, because obviously I don't know.
[1531] several dozen came forward in a matter of weeks.
[1532] That's a pretty high level of dissatisfaction, says James Feinberg.
[1533] No, it's not.
[1534] No, there's fucking thousands of people that have worked there, and a couple dozen came forward.
[1535] That's not a high level of dissatisfaction.
[1536] How many people have been employed at Google that are no longer employed?
[1537] It's probably tens of thousands, right?
[1538] Yeah, there's 70,000 people working there now.
[1539] Okay.
[1540] So for this guy to say, that's a pretty high level of dissatisfaction when several dozen.
[1541] Let's say three dozen.
[1542] Let's go crazy.
[1543] Let's say it's 40 people.
[1544] Let's get nuts.
[1545] That's fucking nobody, man. Oh, 70 women.
[1546] Five Biggers heard from.
[1547] But wait a minute.
[1548] Heard from.
[1549] That doesn't, I mean, they might not even make sense.
[1550] That might not be a case.
[1551] Four.
[1552] Four people.
[1553] Four!
[1554] That's not a lot, you fuck.
[1555] The class action.
[1556] I mean, that's just, this is a fucking ambulance chaser, right? I mean, I'm not saying he's wrong, I'm not saying there's not sexual discrimination, but I'm saying, like, these articles are sneaky as fuck. Four people. You got four people. And I don't know how an individual would know whether or not they're paid differently just based on their sex, right? Because there's so many variables at play, so you really have to look at the system as a whole, because, I mean, there are definitely some men that are paid less than the women, too. When you control for performance, the problem is, when you control for performance, if it turns out that men are being paid more, then you have to figure out some sort of a way to justify that.
[1557] Or, you know, like, if men are being paid more when you control for performance, what is it that's causing the men to be paid more?
[1558] Why are they performing better?
[1559] Like, is it the environment?
[1560] Are they more comfortable?
[1561] Is it lack of suppression that the women experience?
[1562] Yeah.
[1563] So I guess when you look at the nationwide gender
[1564] gap in pay, where, you know, even Obama has said 77 cents per dollar is too little. Yeah, but he's a silly person. Like, he shouldn't have done that. Like, he knows, when Obama said that, he knows that that's not being honest, because you're talking about completely different jobs, different choices. For people who don't know, okay, let's just break that down real quick, this thing, because people repeat it ad nauseam and it's just not true. The gender pay gap of 77 cents to a dollar that a male makes is based on the choices that people make as far as, like, what they do for a living. It's based on the amount of hours that they work. Men tend to work longer hours; women tend to, especially if they get pregnant... all those things are factored in. That's where you get 77 cents on average for the dollar that the male makes. What it implies, and this is where it's disingenuous, is that two people working side by side doing the same job, and the male's getting a dollar for the woman's 77 cents. That's not what the gender pay gap actually means. And if Google is actually, if someone is saying, if there's a lawsuit that's saying that a man and a woman are doing the exact same job with the exact same performance and the woman is only getting 77 cents on the dollar, then you got a real issue, right? Yeah. And so it's often, uh, that, you know, there's different hours worked, and it doesn't even have to be that, you know, they work twice as many or 30 percent more. Sometimes if you just work, you know, 44 hours a week versus 34 hours or something, then there's a huge pay disparity.
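To make the arithmetic behind that distinction concrete, here is a minimal sketch in Python with invented jobs, rates, and hours; it is purely illustrative and not based on any real dataset. Every worker in the same job is paid the identical hourly rate, yet the aggregate ratio still comes out well below 100% because of differences in occupation and hours worked.

from statistics import mean

# Identical hourly rate for everyone in a given job (hypothetical numbers).
HOURLY = {"engineer": 60, "teacher": 30}

# Hypothetical workforce: (gender, job, hours_per_week).
workers = [
    ("M", "engineer", 44), ("M", "engineer", 44), ("M", "teacher", 40),
    ("F", "engineer", 40), ("F", "teacher", 40), ("F", "teacher", 34),
]

def weekly_pay(job, hours):
    return HOURLY[job] * hours

def avg_pay(gender):
    return mean(weekly_pay(job, hours) for g, job, hours in workers if g == gender)

ratio = avg_pay("F") / avg_pay("M")
print(f"aggregate ratio: women earn {ratio:.0%} of what men earn on average")
# ...even though weekly_pay() uses identical rates for identical jobs and hours.

That is the gap between the headline statistic and a like-for-like comparison: the aggregate number mixes occupation and hours together, while the lawsuit-style claim is about equal work at equal performance being paid unequally.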
[1565] Yeah.
[1566] And that's irrespective of what gender you have.
[1567] It's just, you know, especially at Google, there was so much time that was just, you know, replying to email and doing some base level stuff, going to meetings.
[1568] And then you only had a little bit that was actually creative and providing value to the company.
[1569] Really?
[1570] Yeah.
[1571] So it's really inefficient in that regard.
[1572] Right.
[1573] And, you know, it's similar in a lot of companies, too.
[1574] So, and that creates some of this, you know, nonlinear benefits of working just a little more per week.
[1575] And, you know, we see this a lot in Silicon Valley where there's a lot of people right out of college and they're willing to work a ton of time, especially, you know, you can essentially live at Google.
[1576] So there's not, I mean, yeah, there's free food everywhere.
[1577] there's showers, there's a gym.
[1578] They have beds?
[1579] There's nap pods.
[1580] Nap pods?
[1581] Are they closed off so you can't hear anything out there?
[1582] Can you actually sleep in there good?
[1583] I can't really sleep in there just because I'm too tall.
[1584] But most people, yeah, you can sleep in there because they like close it off and you can just lay there.
[1585] Is there a term for being discriminatory towards tall people?
[1586] Yeah, tallest.
[1587] heightist, heightest?
[1588] Like, there's ableists, right?
[1589] If you mock people that aren't able to do things, you become an ableist.
[1590] There's definitely a movement now of looking into, oh, maybe tall people have some advantages, because we see that a lot of CEOs are over six feet tall, and it's not clear why exactly that is.
[1591] And you really have to control for every aspect, because there are, you know, correlations between height and intelligence, but it's likely not just that, you know.
[1592] But what I'm saying is, like, they're discriminating against you with the pods.
[1593] Hook you up with a fucking seven-footer.
[1594] How tall are you?
[1595] Like, six, four?
[1596] Six -three, yeah.
[1597] Hook you up with a six -four pod, man. We can stretch your legs out and get a good nap.
[1598] Maybe you'd be more productive at a job you don't work at anymore.
[1599] Yeah.
[1600] I just never really felt like complaining too much.
[1601] Good for you.
[1602] Except for that one thing.
[1603] Yeah, that one thing.
[1604] So are you trying to seek other employment?
[1605] or are you?
[1606] Yeah, I'm still looking at, you know, what exactly I want to do because I never was, you know, coding wasn't the thing that I was doing my entire life.
[1607] Is that what your education is in?
[1608] No, so I was doing, you know, physics and biology, random math stuff, and I just picked up some algorithm books, and they seemed really cool.
[1609] So I started doing some coding competitions, and I did well enough that Google just, like, randomly contacted me. Wow, how weird.
[1610] And especially since you're a white male. This is back in the day before they figured out how they discriminate. Yeah, this was all online and I had a username, so maybe they didn't know. Oh, that's interesting. So they contacted you and offered you employment based on your coding skills? Yeah. That's weird. Oh, that's cool. So now you're just trying to figure out what the next... How old are you? 28. So you're still a very young man. Yeah. You got to figure out what the path's going to be, huh? Yeah. You're leaning in one way or another? Uh, something that uses my brain, but, yeah, that'll be nice. Something outside of tech maybe, or what? Maybe, or I still feel like, you know, tech in general is for the future and will have a huge impact on the world. Sure. Something related to tech, but maybe not coding all day. But I really don't know, because, you know, most of the major Silicon Valley jobs are probably blacklisting me. Really?
[1611] Yeah.
[1612] That's unfortunate, man. Because like I said, I've read your memo.
[1613] I don't think you did anything wrong.
[1614] I think you took a bold choice and a bold stance to talk about something that's essentially taboo, but you did it with science.
[1615] You know, and you did it, I think you did it in a very reasonable manner.
[1616] You know, and I'm shocked that the reaction has been as extreme as it's been.
[1617] But I'm not shocked at the same time.
[1618] I mean, it's predictable almost.
[1619] And the people calling you a misogynist, it was very weird.
[1620] And the CEO of YouTube saying it hurt her when she read that.
[1621] Like, you're going to have a rough life.
[1622] Yeah, I mean, I thought that, you know, the first, the intro, which talked about all these political biases and how our culture shames people that give a different view, I thought that might have shown that, you know, maybe we shouldn't
[1623] be doing that, but, you know, it predicted exactly what happened to me. I think very few people actually read it. Probably, yeah. Yeah, especially, like, globally, very few people. It's very clickbaity. I think, you know, the responses to it are very clickbaity, and people go with whatever the titles of the articles that are criticizing you say, and just accept it as gospel, you know. Yeah, I've gotten a lot of responses that were just, oh, yeah, I saw it on Facebook, you know, some sexist memo. And, you know, it was only after they saw that so many times and they decided to read it that they finally were like, oh, no, it's not that bad.
[1624] Yeah, I urge people, if you have the time, just please just read it.
[1625] Just go over it and try to figure out where it all went wrong.
[1626] I'm glad you did it, though, man. I mean, it's a really interesting point of discussion, and I hope this lawsuit works out well for you.
[1627] And I hope Google just comes to their senses, and I don't think that's going to happen.
[1628] But, you know, what do you predict?
[1629] What do you think is going to happen?
[1630] I don't know.
[1631] I think that, you know, people now are aware of this a lot more.
[1632] And there may be platforms that emerge that are sort of, you know, alt tech is what they're calling it, just alternative technology that's more open to just free speech.
[1633] But unfortunately, they're currently just being labeled as white supremacist sites.
[1634] And it's, I hope.
[1635] Hopefully people can see through that.
[1636] I don't know.
[1637] If they have the time to even look, that's the thing.
[1638] It's just like they're taking everyone's word for everything.
[1639] It's very odd time, but there's enough people discussing it.
[1640] And I think the response to your memo has been, it's been very enlightening for some people.
[1641] From a sort of a psychological standpoint, like what are the reactions that people have and why do they have these reactions and what does it say about us as human beings that this is such a taboo subject that we can't even address the very real differences that we have as unique individuals, you know?
[1642] Yeah, I'm at least happy that it didn't happen during college season, because then there would be, like, protests and people burning my effigy.
[1643] Do you think so?
[1644] I think it would have been much more negative if it was during the school year.
[1645] Wow.
[1646] And they would demand that their school, you know, double down on diversity and just all these things.
[1647] Yeah, a lot of virtue signaling going on.
[1648] Yeah.
[1649] At least it's nice to see that some of the colleges have been standing up for it, or against it.
[1650] And, you know, saying, no, you can't really just tell us what to do.
[1651] And we believe in, you know, knowledge and actually seeking the truth.
[1652] and not just criticizing people based on their political ideologies.
[1653] Yeah.
[1654] It's a long slog, my friend.
[1655] There's a lot of walking and talking going on.
[1656] But I think we'll be fine.
[1657] I hope we'll be fine.
[1658] If we don't go to war with North Korea, get smashed by 100 fucking hurricanes in a row.
[1659] But I think we should just, what's the matter?
[1660] I got a tweet that you're a badass in chess.
[1661] Oh, you're a chess master?
[1662] Yeah, I played a lot of chess.
[1663] Yeah?
[1664] That was my life for a few years.
[1665] Can you play chess in your head?
[1666] Yeah.
[1667] Wow, that's fascinating.
[1668] I used to play chess in my head against like four different people, so like blindfolded.
[1669] Whoa, dude.
[1670] I knew this kid who was a chess master at a pool hall that I used to go to, and he used to play with this ex-con, and the ex-con learned how to do chess in prison in his head with no pieces.
[1671] And him and this kid would just sit there and play chess back and forth with each other.
[1672] And I was like, what are you guys doing?
[1673] How do you know where the board is?
[1674] You could play it blindfolded with four people in your head.
[1675] Yeah.
[1676] Wow, that's intense, man. How'd you learn to do that?
[1677] Just repetition over time?
[1678] Yeah, doing it a lot and just obsessing over it.
[1679] And this is actually one of the differences, on average, between men and women: there are more men that just become obsessed with these systems.
[1680] And so, you know, Magic: The Gathering, the card game, was also something that I became super obsessed with.
[1681] And so the way that people approach computers, too, is different, where, you know, some or a lot of boys just approach the computer as a toy and they become obsessed with tinkering with the computer while a lot of girls see it as a tool for improving the world.
[1682] And so they may not be interested in the computer as an end in itself.
[1683] And so a lot of the education programs to get more women into tech are actually addressing that.
[1684] But it's unclear because so much of coding is just writing server code and this server is going to talk to this server, which is talking to that server, and it's totally unconnected to actual people.
[1685] But that's why we actually see more women in front end and user experience.
[1686] engineering positions, because it's more interactive with people. What are the numbers with women in chess? Yeah, there aren't that many, and it's unfortunate. Uh, is it unfortunate? Because it just is, I mean. Yeah. Yeah, I don't play chess. Is that unfortunate? So why is it unfortunate that women are underrepresented? Well, just for the cases of maybe they feel like a minority, and, you know. Right. And a lot of the mistreatment of women is not, you know, ill-intentioned men that want to be sexist against women.
[1687] It's just, and I felt this a lot, you know, everyone wants a girl that plays chess or plays Magic: The Gathering.
[1688] You know, that's their ideal girlfriend, right?
[1689] Right.
[1690] And so, but they're all nerdy guys, generally, and they don't have as good social skills as the average population.
[1691] And, you know, they're pretty similar to the people playing or writing code.
[1692] So it's a similar situation.
[1693] So they just don't know how to interact with women.
[1694] And that causes some problems.
[1695] But so it's not the just overt sexism against women.
[1696] It's more just we don't really know how to interact with women.
[1697] We just, you know, we're obsessed with chess or whatever.
[1698] And we just like talking about chess.
[1699] And there are a lot of women that just aren't as obsessed with these sort of systems.
[1700] But, I mean, that's not a bad thing either.
[1701] It just is what it is.
[1702] Yeah.
[1703] I mean, I'm sure there's a bunch of fashion things and aesthetic things and design things that women are really into that a lot of men don't give a fuck about.
[1704] Yeah.
[1705] It's not a terrible thing that men aren't into design.
[1706] That there's not more men involved in interior design.
[1707] It's not a terrible thing.
[1708] And that's one of the unfortunate things too is that there's so much fighting to get more women into tech, but there's no fighting to get more men into nursing or any of these more female -dominated careers.
[1709] Do you think that's also because of the financial rewards of tech are so extreme in comparison?
[1710] Like nursing is a pretty capped salary, whereas if you can climb the corporate ladder, as a, you know, a CEO of some sort of a tech company, the rewards are substantial.
[1711] Yeah, I think inevitably there will be more men attracted to high -paying jobs simply because they fight for status and money is how you gain status often.
[1712] So that's partly why they see tech as a target.
[1713] But it's not as if nursing is a bad job.
[1714] Right.
[1715] That gets paid well.
[1716] And there are many people that, you know, go to college for pre-med and drop out.
[1717] Like, 90% of people that start as pre-med drop out. And the men feel like they can't enter nursing because that's too feminine, and there's huge biases against men becoming feminine. Like, you know, men can't wear dresses, but girls can be openly tomboyish. Right. Right. So there's unfortunately some asymmetries in our culture, and there's reasons for it. You know, if a guy is too feminine, then he can't necessarily fulfill his gender role, which is being a provider and protector.
[1718] So, you know, you have to be aggressive to be a good protector and provider for your wife.
[1719] But the female's gender role being a nurturer is, you know, it's fine to be feminine.
[1720] And so a lot of the gender disparities that we see and gender norms are just put behind, those two gender roles.
[1721] Yeah, I think there's a lot of evidence to support that.
[1722] And I think that was essentially a big part of what you were talking about in your memo.
[1723] And I don't think you're a bad guy, dude.
[1724] And I think you've been unfairly maligned.
[1725] And I'm glad we had a chance to sit down and talk.
[1726] And I wish you well, man. I hope it all works out.
[1727] And keep us posted.
[1728] And we'll let everybody else know, too, okay?
[1729] Oh, yeah, yeah.
[1730] Thank you, James.
[1731] Appreciate it, man. Very nice to meet you.
[1732] Thank you.
[1733] All right, folks.
[1734] That's it for today.
[1735] Bye, bye.
[1736] How long was that?
[1737] Almost three hours.