The Joe Rogan Experience XX
[0] ...people though they really are. It's just a fucking hard business, especially when you didn't see it coming. Two, one. Hello, Renee. Hello. Thanks for doing this, I really appreciate it. Thanks for having me. Uh, I listened to you on Sam Harris's podcast and I was utterly stunned. I had to listen to it twice because I just couldn't... Let's get into this from the beginning. How did this start out? How did you start researching these, uh, online Russian trolls and bots and all this jazz? Yeah, so a couple of years back, around 2015... I had had my first baby in 2013 and I was getting them on these preschool lists.
[1] And what I decided to do was I started looking at anti-vaccine activity in California because I had a kid and I wanted to, you know, put them on preschool lists where I was going to fit with the parents, basically, as someone who vaccinates.
[2] And I started looking at the way that small groups were able to kind of disproportionately amplify messages on social channels.
[3] And some of this was through very legitimate activity, and then some of it was through really kind of coordinated, deliberate attempts to kind of game ways that algorithms were amplifying content, amplifying particular types of narratives.
[4] And I thought it was interesting, and I started writing about it, and I wound up writing about hashtag gaming, ways in which people were kind of using automation to just be in a hashtag all the time.
[5] So it was kind of a way to really gain control of share of voice.
[6] And what that meant when very small groups of people could achieve this kind of phenomenal amplification, and what the pros and cons of that were.
[7] And then this was 2015.
[8] So the way that this sort of awareness of social media challenges came about was actually, when I was working on this, other people were looking at the same tactics, but how they were being used by ISIS, by the terrorist organization.
[9] And there also, you had this very small group of people that managed to use bots and amplification to really kind of own a narrative, really push this brand, this digital caliphate, to kind of build it on all social platforms almost simultaneously, and the ways in which information was hopping from one platform to another through kind of deliberate coordination, and then also just ways in which information flows kind of contagion style.
[10] And I wound up working on thinking about how the government was going to respond to the challenge of terrorist organizations using American social platforms to spread propaganda.
[11] So what we came to realize was that there was just this information ecosystem and it had evolved in a certain way over a period of about eight years or so and the kind of unintended consequences of that.
[12] And the way that Russia kind of came into the conversation was around October 2015 when we were thinking about what to do about ISIS, what to do about terrorism and terrorist, you know, kind of proliferation on social platforms.
[13] This was right around when Adrian Chen had written the article "The Agency" for the New York Times.
[14] And that was one of the first big exposés of the Internet Research Agency.
[15] The first time an American journalist had gone over there and actually met the trolls, been in St. Petersburg, and began to write about what was happening over there.
[16] And the ways that they had pages that were targeting certain facets of American culture.
[17] So while we were in D.C. talking about what to do about terrorists using these platforms to spread propaganda, there were beginning to be rumblings that Russian intelligence and, you know, Russian entities were doing the same thing.
[18] And so the question became, can we think about ways in which the Internet is vulnerable to this type of manipulation by anyone and then come up with ways to stop it?
[19] So that was how the Russia investigation began.
[20] It was actually around 2015, a handful of people started looking for evidence of Russian bots and trolls on social platforms.
[21] So 2015. If we think about social media and the birth of social media, essentially it had only been alive for, I mean, what was Twitter, 2007, I believe? Something like that. So eight years, like eight years of social media, and then all of a sudden they figured out how to game this system, and then they figured out how to use this to make people argue against each other. Yeah, so I think, so there was this, um... if you go back to, like, um, you remember like GeoCities? And they... Yeah, sure.
[22] Okay, AOL, you used that?
[23] Yeah, of course.
[24] So we're probably about the same age.
[25] So there have always been, you know, kind of... the thing that was great about the internet, like internet 1.0, we can call it, right?
[26] It was this idea that everybody was given a platform and you could use your platform, you could put up your blog, you could say whatever you wanted.
[27] You didn't necessarily get attention, but you could say whatever you wanted.
[28] And so there was this kind of consolidation, as social platforms kind of came into existence, content creators were really excited about the fact that now they not only had this access to write their own stuff, but they also had access to this audience, because as the network effects got more and more pronounced, more and more people came to be on social platforms.
[29] And it originally wasn't even Facebook.
[30] If you remember, it was like, you know, there's like Friendster and MySpace and social networks kind of evolved.
[31] When I was in college, Facebook was still limited to like, you know, a handful of like Ivy League schools.
[32] And so I wasn't even eligible.
[33] And as you watch this consolidation happen, you start to have this information ecosystem really dominated by a handful of companies that grow very large because they're providing a service that people really want, but there's a kind of mass consolidation of audiences onto this handful of platforms.
[34] So this becomes really interesting for regular people who just want to find their friends, reach people, spread their message, grow an audience.
[35] It also becomes really interesting for propagandists and trolls and in this case terrorist organizations and state intelligence services because instead of reaching the entire internet they really just kind of have to concentrate their efforts on a handful of platforms.
[36] So that consolidation is one of the things that kind of kicks off... one of the reasons that we have these problems today.
[37] Right.
[38] So the fact that there's only a Facebook, a Twitter, an Instagram, and a couple of other minor platforms, other than YouTube... I mean, on YouTube you can tell it's an actual person.
[39] Like YouTube is a problem, right?
[40] Because you could see it's an actual person.
[41] If you're, if you're narrating something, you know, if you're in front of the camera and explaining things, people are going to know that you're an actual human being.
[42] Whereas there's so many of these accounts that I'll go to, like I'll watch people get involved in these little online beefs with each other, and then I'll go to some of these accounts, I'm like, this doesn't seem like a real person.
[43] And I'll go, and it's like hashtag MAGA, there's an American eagle in front of a flag. And then you read their stuff and you're like, wow, this is probably a Russian troll account. And it's strange, like you feel like you're not supposed to be seeing this, like you're seeing the wiring under the board or something. And then you'll go through the timeline, and all they're doing is engaging people and arguing, you know, for Trump and against, you know, whatever the fuck they're angry about, whatever it is that's being discussed. And they're basically just like some weird little argument mechanism.
[44] Yeah, so in 2016, there was a lot of that during the presidential campaign, right?
[45] And there were, there was so much that was written.
[46] You know, we can go back to the free speech thing we were kind of chatting about before.
[47] There was so much that was written about harassment and trolling and negativity and these kind of hordes of accounts that would brigade people and harass them.
[48] Of course, a lot of that is just real Americans, right?
[49] There are plenty of people who are just assholes on the internet.
[50] Sure.
[51] Um, but there were actually a fair number of these as we began to do the investigation into the Russian operation in, uh, and it started on Twitter in about, um, 2014, actually.
[52] So, 2013, 2014, the Internet Research Agency is targeting Russian people.
[53] So they're tweeting in Russian at Russian and Ukrainian folks, people in their sphere of influence.
[54] So they're already on there.
[55] They're already trying this out.
[56] And what they're doing is they're creating these, uh, these accounts.
[57] It's kind of wrong to call them bots, because they're real people.
[58] They're just not what they appear to be.
[59] So I think the unfortunate term for it has become, like, cyborg, like semi-automated.
[60] You know, sometimes it's automated.
[61] Sometimes it's a real person.
[62] But a sock puppet is the other way that we can refer to it, a person pretending to be somebody else.
[63] So you have these sock puppets and they're out there, and they're tweeting in 2014 about the Russian annexation of Crimea or about MH17, that plane that went down, which Russia, you know, of course, had no idea what happened, and it wasn't their fault at all.
[64] And gradually, as they begin to experience what I imagine they thought of as success, that's when you see some of these accounts pivot to targeting Americans.
[65] And so in late 2014, early 2015, you start to see the strategy that for a long time had been very inwardly focused, making their own people think a certain way or feel a certain way or have a certain experience on the Internet.
[66] it begins to spread out.
[67] It begins to look outwards.
[68] And so you start to see these accounts communicating with Americans.
[69] And as we were going through the data sets, which the Twitter data set is public, anyone can go and look at it at this point.
[70] You do see some of the accounts that are kind of, you know, that were somewhat notorious for being really virulent, nasty trolls, anti-Semitic trolls going after journalists, you know, some of these accounts
[71] being revealed as actually being Russian trolls.
[72] Now it doesn't kind of exculpate the actual American trolls that were very much real and active and part of this and expressing their opinion, but you do see that they're mimicking this.
[73] They're using that same style of tactic, that harassment to get at real people.
[74] And if they do get banned, if their account gets banned, they just simply make another account.
[75] They use some sort of, you know... what is it, a virtual server?
[76] What is that called?
[77] You mean VPNs?
[78] VPN, that's it.
[79] Yeah, so if they do that, they can kind of do that as long as they want.
[80] They can continue to make new accounts.
[81] And it probably also emboldens the actual American trolls because they're going to go out a little bit further than everybody else, a little bit crazier.
[82] And it kind of changes the tone of discourse within these communities that are arguing about a certain subject.
[83] Things get nastier.
[84] and they're getting nastier because of the interference of these trolls.
[85] Like it seems like they've actually managed to not just cause a lot of discord, but to change the way people are interacting with each other, and to just make it more vicious.
[86] Yeah.
[87] So what they're doing is they're operating in communities.
[88] So one of the really common criticisms is, a lot of people think that this didn't have a huge impact. Did it, you know, did it swing the election?
[89] We have no idea.
[90] But what it does do in the communities that it targets is it can change that tone.
[91] And that's where you see, I mean, I think everybody's probably had this experience.
[92] You're part of a group and then a new person gets added to the group and the dynamic changes.
[93] It's very much the same kind of thing, just that these are not real people who are joining the group.
[94] And so there's this opportunity to, you know, kind of expand the bounds of tolerance just that little bit more,
[95] or try to normalize using particular ways of communicating that maybe a group wouldn't naturally gravitate to, but then it does.
[96] So there are definitely ways in which any type of troll doing this, it doesn't have to be a Russian troll, has this ability to kind of shift the language, shift the community, shift the culture just a little bit.
[97] Now, why did the agency do this?
[98] And do we know, do we have someone who's ever left there or become a whistleblower who can give us some information
[99] about what the mandate was and how it was carried out?
[100] There have been a couple of whistleblowers and actually some investigative journalism in Russia that's covered this.
[101] They describe the employees of the Internet Research Agency.
[102] So it's a little bit like a social media marketing agency plus tactics that we would not expect a social media marketing agency to use, things that are a little more like what you would expect to see from an intelligence agency.
[103] So besides just making your pages and your blogs and your social posts, they're also in there kind of connecting with real people and real activists and pretending to be something that they're not to develop kind of a one -on -one relationship.
[104] But most of the whistleblowers who have come out... there's a woman named Lyudmila Savchuk.
[105] She wrote an exposé, I believe, on this.
[106] And it's described as being much like you would expect if you were doing social media grunt work.
[107] You have a certain number of posts per day.
[108] You know, you're trying to get a certain amount of engagement.
[109] You're trying, you've got to kind of hit your quotas.
[110] Most people are young millennials, the people that work there.
[111] They're well -versed in trolling culture.
[112] They're well -versed in internet culture.
[113] You know, they're up to speed on, like, popular memes and things like that.
[114] And so you do see this. And then the other thing that they do, they talk about it in the Mueller indictment, you see some really interesting descriptions of, like, the standups that they have.
[115] Standup is a thing you do at a tech company where everybody kind of stands up and talks about their goals and responsibilities and blockers and things.
[116] And in these standups, they would be sitting there saying things like if you're targeting black LGBT people, make sure you don't use white people in your image and your meme because that's going to like trigger them.
[117] So trying to get at the very niche rules for, you know, for communicating
[118] authentically in an American community, which, as you know, online, you know, sometimes there are very specific ways in which a community expects a member of that community to communicate.
[119] Yeah.
[120] And so they are in there, and you can read in these filings by Mueller's team and by the Eastern District of Virginia, the degree of granularity that they have to recognize that if you are running a black LGBT page and your meme is of white people, you're going to cause some tension and consternation.
[121] And assuming that that's not necessarily what you want to be doing, you should go find the meme of black LGBT people to put in the, you know, to put as your meme for the day.
[122] So there's a lot of, there's a lot of sophistication.
[123] There's a lot of understanding of American culture.
[124] And then there's a lot of understanding of trolling culture.
[125] And so these things combined to be a rather effective, you know, very effective social media agency.
[126] And is there an overwhelming sort of narrative that they're trying to pursue, they're trying to push?
[127] So what we saw, so I did the, I did some of the research for the Senate, and the Senate data came from the platforms.
[128] So what I had was the attribution was made by the platforms.
[129] It wasn't like Renee deciding this was IRA.
[130] It was the platforms giving it to our government.
[131] And the information in there, what it showed was that across all platforms, across Twitter, across Facebook, Instagram, YouTube, they were building up tribes.
[132] So they were really working to create distinct communities of distinct types of Americans and that would be, for example, there's an LGBT page that is very much about LGBT pride.
[133] There's...
[134] And they created it?
[135] And they created it.
[136] And they curate it and they...
[137] Create it, curate it.
[138] It has a, you know, there's like a persona.
[139] A lot of the posts on the LGBT page were written by what sounded kind of like a millennial lesbian; that was the voice.
[140] So it was a lot of, you know, memes of LGBT actresses, and they would brand it with a specific brand mark.
[141] It was a rainbow heart.
[142] LGBT United was the name of the page.
[143] It had a matching Instagram account, which you would also expect to see from a media property, right?
[144] You would expect to see them in both places.
[145] And this, you know...
[146] What were they pushing?
[147] It read like a young woman talking about crushes on actresses and things, actually.
[148] You know, it was really, besides the sometimes wonky English, virtually
[149] indistinguishable from what you would read on any kind of, like, young, millennial-focused social page.
[150] It wasn't, none of it was radical or divisive.
[151] It wasn't like... the way that they got the division across was they built these tribes where they're reinforcing in-group dynamics.
[152] So you have the LGBT page, you have numerous pages targeting the black community.
[153] That was where they spent most of their energy.
[154] A lot of pages targeting the
[155] far right.
[156] So both old far right, meaning people who are very concerned about what does the future of America look like, and then young far right, which was much more angry, much more like trolling culture.
[157] So they recognize that there's a divide there, that the kinds of memes you're going to use to target younger right -wing audiences are not the same kinds of memes you're going to use to target older right -wing audiences.
[158] So there's a tribe for older right -wing, younger right -wing.
[159] In the black community, there's a Baptist tribe.
[160] There's a black liberation tribe.
[161] There's a black women tribe.
[162] There's one for people who have incarcerated spouses.
[163] There's a Brown Power, I believe was the name of it, page that was very much about Mexican and Chicano culture.
[164] There was Native Americans United.
[165] And all of these are fake?
[166] All these are fake.
[168] And what are they trying to do with all these?
[169] So you build up this in -group dynamic and they did this over years.
[170] So this was not a short -term thing.
[171] They started these pages in the 2014-2015 time frame, most of them. They started some other ones that were much more political later; we can talk about the election if you want to. But with this tribal thing, you're building up tribes. So you're saying, like, as black women in America, here's posts about things that we care about: here's posts about black hair, here's posts about child rearing, here's posts about fashion and culture. And then every now and then there would be a post that would reinforce, like, as black people, we don't do this. Or as LGBT people, we don't like this. And so you're building this rapport. So, like, me and you, we're having a conversation, we're developing a relationship on this page over time, and then I say, like, as this kind of person, we don't believe this. So it's a way to subtly influence by appealing to an in-group dynamic, or appealing to, like, as members of this tribe: as LGBT people, of course we hate Mike Pence; as black people, of course we're not going to vote because, you know, we hate Hillary Clinton because we hate her husband.
[172] As, um, as people who are concerned about the future of America, as Texas secessionists, you know. So everything is presented as: members of this tribe, we think this.
[173] As members of this tribe, we don't think this.
[174] But a lot of the posts, sorry, but a lot of the posts were not even political.
[175] They were just sort of affirming the standards of the tribe.
[176] Yes.
[177] So they were kind of setting up this whole long game.
[178] Yep.
[179] And then once they got everybody on board, how many followers do these pages have?
[180] So there was kind of a long tail.
[181] There were, I think, 88 pages on Facebook and 133 Instagram accounts.
[182] And I would say maybe 30 of the Facebook pages had over 1,000 followers, which is not very many.
[183] And then maybe the top 10 had upwards of 500,000 followers.
[184] So there's, you know, the same way you run any social campaign, sometimes you have hits, sometimes you have flops.
[185] Right.
[186] And what was interesting with the flops is you would see them repurpose them.
[187] So they would decide, you know, the same way if you're running a social media agency, well, we've got this audience.
[188] This page isn't doing so well.
[189] It's like, rebrand it a little bit, change it up, try to make it appeal to somebody else.
[190] So you do see this.
[191] There is a... so, I got this data set and I was going through these Instagram memes,
[192] 133,000 of them.
[193] And I was, there was a cluster of images of Kermit the Frog.
[194] I was like, what the hell is Kermit the Frog doing in here?
[195] And so then I go... so, the way the platforms provide the data is, I got like a CSV of the posts and then I got a folder of the images.
[196] And so in order to like connect the dots, I had to have the image up on one screen and the, this thing, the CSV up on the other screen.
[197] CSV?
[198] It's like a spreadsheet.
[199] Okay.
[200] Yeah.
[201] And I, and we, you know, turned it into a database that we could track things a little bit more easily across the platforms.
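To make that pipeline concrete, here's a minimal sketch of the kind of join she describes: a CSV of post metadata matched against a folder of image files and loaded into one queryable SQLite database. The file layout, column names, and image-naming convention (an image file named after its post id) are hypothetical stand-ins, not the actual schema the platforms provided.

```python
import csv
import sqlite3
from pathlib import Path

# Minimal sketch: join a CSV of posts with a folder of images in SQLite.
# All file and column names below are assumptions for illustration.
conn = sqlite3.connect("ira_posts.db")
conn.execute("""CREATE TABLE IF NOT EXISTS posts (
    post_id TEXT PRIMARY KEY,
    account TEXT,
    platform TEXT,
    text TEXT,
    image_path TEXT)""")

# Index the image folder once: filename stem -> full path.
images = {p.stem: str(p) for p in Path("images").glob("*")}

with open("posts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        conn.execute(
            "INSERT OR REPLACE INTO posts VALUES (?, ?, ?, ?, ?)",
            (row["post_id"], row["account"], row["platform"],
             row["text"], images.get(row["post_id"])),  # match image by post id
        )
conn.commit()

# Now a cluster like the Kermit memes is one query instead of two screens:
for account, n in conn.execute(
    "SELECT account, COUNT(*) FROM posts GROUP BY account ORDER BY 2 DESC LIMIT 10"
):
    print(account, n)
```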
[202] But, um, so I have this cluster of Kermit the Frog memes and I go and I look and I realize that the, they're attributed to an account called Army of Jesus.
[203] And I thought, well, that's interesting.
[204] You know, these are, some of them were really raunchy.
[205] It was like, it was like Kermit and Miss Piggy, like, you know, I mean, it was just, like, stupid, crappy memes.
[206] Um, you know, um, attached to Army of Jesus and what the hell is going on here?
[207] I keep going through it, hundreds of Kermit memes, and then I get to a post where they say, like, this page is owned by Homer Simpson now, Kermit went to jail for being... I don't know, they made some joke, it was stupid. And all of a sudden the data set turns into Homer Simpson memes. So again, like, this kind of raunchy Homer Simpson culture, and again, it's attributed to Army of Jesus. And then I go through all this and realize that they didn't get to actually making Army of Jesus a Jesus-focused page until, like, 900 posts in.
[208] So they just renamed the account at some point.
[209] It used to be called Nuts News.
[210] Nuts News was what they called it when it was the Kermit the Frog meme page.
[211] And then it gets repurposed when they realize Kermit's not doing it.
[212] It's not getting the audience they want.
[213] Homer Simpson's not getting the audience or engagement they want.
[214] And then they pivot over to Jesus.
[215] And then all of a sudden they start, you know, the likes and things start pouring in.
[216] So what they're doing is, either deliberately or not, they're creating placeholders.
[217] It's kind of a red flag when a brand new account that was created yesterday suddenly starts talking about some highly politically divisive thing or whatever.
[218] But if you lay the groundwork and you do it over a period of two years, then somebody who goes and checks to see what the account was where it came from, how old it is, is going to see something that was two years old.
[219] So it's an opportunity to create almost like sleeper accounts where you create them now and then you activate them, you politicize them, you actually put them to use a couple of years in the future.
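As a toy illustration of the heuristic she's describing, and why seeding pages years in advance defeats it, consider checking how old an account is when it first turns political. All dates and the threshold below are invented for the example:

```python
from datetime import date

def days_before_politicized(created: date, first_political_post: date) -> int:
    """Age of the account, in days, when it first posted political content."""
    return (first_political_post - created).days

# A burner account politicized a day after creation trips the red flag;
# a "sleeper" page seeded with memes for two years looks organic by this test.
burner = days_before_politicized(date(2016, 10, 1), date(2016, 10, 2))   # 1 day
sleeper = days_before_politicized(date(2014, 9, 1), date(2016, 10, 2))   # ~760 days

RED_FLAG_DAYS = 30  # arbitrary threshold for the illustration
for name, age in [("burner", burner), ("sleeper", sleeper)]:
    print(name, "-> red flag" if age < RED_FLAG_DAYS else "-> passes the age check")
```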
[220] So we saw all kinds of, you know, we saw this over and over again.
[221] There was a Black Guns Matter account that turned into an Anonymous account at one point.
[222] They were pretending to be Anonymous, you know, the hacktivists.
[223] Yeah, so they repurposed this Black Guns Matter page, which was advocating that black people
[224] buy weapons and carry, and it's like a pro-Second Amendment page, but for the black community. And they took that page when it wasn't getting, I guess, a ton of engagement, and it became... it was called, um, oh gosh, I don't remember the exact name of the Anonymous page, and I don't want to say something that's legit. But they pivoted into an Anonymous page. And when they do that, do they go back and repurpose the content of the earlier posts? Do they change them? Edit them?
[225] That wasn't clear.
[226] We didn't get that information from the platforms.
[227] There was a lot of stuff that I would have loved to have more insight into.
[228] We could see, again, you know, you'd think if you started following an Army of Jesus page and you had all this raunchy Kermit shit from like a year ago, that would raise some flags.
[229] I would assume that they scrubbed it and restarted, but I don't know.
[230] Your podcast with Sam changed how I look at a lot of the pages that I actually follow, because I follow some pages like that, pages that have classic cars or something like that.
[231] And then I'll see them, and most of it is just photographs of cars, like beautiful old cars, and it'll have, you know, they'll have a giant following, and then all of a sudden something will get political.
[232] And I'll look at it and go, oh, wow, like, this is probably one of those weird accounts.
[233] Like, they're getting people to get engaged with it because it represents something that they're interested in, like classic muscle cars.
[234] And then they use it for activism, and they use it to get this narrative
[235] across.
[236] I think... I mean, I've seen it happen with some of mine too.
[237] Um, I think one of the challenges is like you want people to be aware that this stuff exists, but you don't want them to be paranoid that it's everywhere.
[238] I am paranoid.
[239] I know.
[240] That's a problem.
[241] Everybody's a troll now.
[242] I look at this all day long and sometimes I see things and I'm like, you know, what are the odds?
[243] But I try to, you know, not feel like... you don't want to feel like you're in some, like, Tom Clancy novel. But, um, it's this balance between, when you make people aware of it...
[244] And I think people deserve to be aware of it.
[245] They deserve to understand how this plays out.
[246] The flip side of that is you do wind up in these weird... you know, you see it happen on social media now. Or, um, click into a Trump tweet and you'll see, like, you're a Russian bot.
[247] No, you're a Russian bot.
[248] Like, they're probably not Russian bots.
[249] You know, everybody you don't like on the internet is not a Russian bot.
[250] Yes.
[251] I'm just an asshole.
[252] Exactly.
[253] And so that's where you
[254] get at the interesting conversations of, you know... in some ways, getting caught, this is one of the challenges with running disinformation campaigns, right?
[255] It makes it really hard for people to know what's real after the fact.
[256] It leaves you a little bit off balance, right?
[257] Feel like, you know, when you feel like you can't quite tell what's real.
[258] And that's part of the goal, right?
[259] It's to make you not have a, not feel entirely balanced in, um, in your information environment.
[260] Is this real?
[261] Is this not?
[262] And so in some ways, there's not much downside to doing this, right?
[263] Because either you knock it out of the park and you influence the election and you influence people and you have this secret covert operation going on for years, um, or you get caught, and then, you know, until there's some confidence in the ability of platforms to detect this stuff...
[264] There's real concern among everybody that you're encountering something fake.
[265] Now, the overwhelming narrative is that the Russians were very much invested in having Trump win, right?
[266] And if they were very much invested in having Trump win, was the reason why they focused so heavily on the African American community, because the African American community traditionally seems to vote Democrat?
[267] So they were trying
[268] to do something to break that up, or trying to do something to weaken the position of, you know, the incumbent, or Hillary Clinton, and maybe put some emphasis on Jill Stein or some alternative candidates?
[269] Yeah.
[270] So the way that the political campaign, the political aspect of it played out, so they established, they started building these relationships in 2015.
[271] And, you know, they're doing this tribal thing.
[272] We've got our in -group.
[273] We're part of this community.
[274] And then what you start to see them do is, early on, there was actually a tiny, tiny cluster in the early primaries where they were supporting Rand Paul, and then they pivot to Trump pretty quickly. And probably Rand Paul just didn't poll well, and they were like, there's no way to get any lift here, but maybe Trump was getting, you know, some actual lift in the media. And so you see them move into supporting Trump. And then for the remainder of the data set, from 2015 through the end, which was mid-2017 or so, is when this thing ends.
[275] It's adamantly pro-Trump on the right.
[276] And on the right, you see not only pro -Trump, but you see them really working to like erode support for mainstream or traditional Republicans, traditional conservatives.
[277] You see a lot of the memes about, like, are you with the cuckservatives or the conservatives?
[278] And so the cuckservatives, of course, they've got pictures of, like, Lindsey Graham and John McCain.
[279] They hate John McCain.
[280] John McCain shows up a million times.
[281] And Marco Rubio, Ted Cruz.
[282] Well, I think that they, you know, one of the theories is, and I believe this is probably true, they really strongly disliked Hillary Clinton because there was concern that she would, you know, things that she was saying about increasing freedoms in Russia were very threatening.
[283] They thought the best bet to get sanctions removed was Trump.
[284] So they had specific outcomes that they were hoping for.
[285] And that was one of, you know, so there's always like a political motivation.
[286] So there is this narrative around they just want to kind of like screw with American society, create divisions, amplify divisions.
[287] When you look at the political content, the clear and sustained support for Trump, and even more than that, the clear disdain for Hillary Clinton... on Facebook and Instagram, there was not one single pro-Hillary post.
[288] There were some anti-Trump posts, because if you're running an LGBT page, of course they're going to say negative things about Trump. You know, and they're saying it:
[289] so you should vote for Jill Stein.
[290] There was early support on some of the left -leaning pages for Bernie Sanders.
[291] But you actually see the support for Bernie Sanders come in more after it becomes clear that he's not going to win.
[292] Because then they're using Bernie Sanders as a way to say this was stolen from him by the evil Clintons. Or Jill Stein,
[293] you know, here's a true, a true independent, real liberal.
[294] We should be voting for her if we want to support a woman.
[295] So there are these feminism pages really pushing this narrative of Jill Stein.
[296] So you have the left-leaning pages, totally anti-Clinton, and then you have the right-leaning pages, staunchly pro-Trump, and also strongly anti-Cruz, anti-Rubio, anti-Lindsey Graham, basically anti-everything that's now called establishment Republican.
[297] And there's this kind of pushing of people to opposite ends of the political spectrum.
[298] So this is where you get at the conversation around facilitating polarization.
[299] So it wasn't enough to just support Donald Trump; it was also necessary to strongly disparage the kind of traditional, conservative, moderate center-right in the course of amplifying the Trump candidacy. Does that make sense? Yes, it does. It's a lot of stuff. Yeah, it is a lot of stuff, but it does make sense. And one of the things that was really bizarre to me, watching the election, and I was trying to figure out: is this because Trump is so bombastic and he's so outrageous and he's just a different person? The way I was describing it on stage was that, like, finally the assholes have a king, because they never had a king before. Like, everyone who was running for president was at least mostly dignified. I mean, it's really difficult to go back in time and find someone who isn't... there's no one who insults people like he does. I mean, he insults people's appearances.
[300] He calls them losers.
[301] He called Stormy Daniels horse face.
[302] I mean, he says some outrageous shit.
[303] So part of it was me thinking like, wow, maybe he's just ignited and emboldened.
[304] I actually had this conversation with my wife today.
[305] She was like, it feels like racism is more prevalent.
[306] Like it's more, it's more accepted.
[307] People feel more emboldened because, in their minds, they think: he is a racist,
[308] I can get away with more things.
[309] Trump is president.
[310] Like there's actually videos of people saying racist shit and saying, hey, Trump's president now, we can do this.
[311] So I was thinking that, well, maybe that's what it was.
[312] It's just sort of like some rare flower that only blooms under the right conditions.
[313] Poof, it's back, right?
[314] But when you think about the influence that these pages have had in establishing communities and this long game that they're playing, like the LGBT pages, even though they're shitting on Trump, they really want to support Jill Stein because they know that'll actually help Trump because it'll take votes away from Hillary Clinton.
[315] It seems different. Like, political discourse, discussions online, and social media, the way social media reacted... I mean, there were a lot of people that were anti-Obama before, you know, either of his elections that he won, but it seemed different.
[316] It seemed different to me than this one.
[317] This one seemed like we had moved into
[318] another level of hostility that I'd never experienced before, and another level of division between the right and the left that I had never experienced before.
[319] And like a willingness to engage with really harsh, nasty comments and just to dive into it, you would see it all day.
[320] I mean, there were certain Twitter followers, that I think they're pretty much human beings, but I would follow them and they would just be engaged with people all
[321] day long, just shitting on people and criticizing this and insulting that.
[322] And it seemed like, it seemed dangerous.
[323] It seemed like things had moved into a much more aggressive, much more hostile and confrontational sort of chapter in American history.
[324] If this is all done at the same time that this is happening, how much of an influence do you think this IRA agency had on all this stuff?
[325] That's the question that we would all like the answer to and I unfortunately can't give it.
[326] In your mind though.
[327] Yeah, let me kind of caveat that.
[328] The thing that we don't have that nobody who looks at this on the outside has is we can't see what people said in response to this stuff.
[329] So I've looked at now almost 200,000 of these posts; that's what I spent most of last year doing, was this
[330] research.
[331] And we can see that they have thousands of engagements, thousands of comments, thousands of shares.
[332] We have no idea what happened afterwards.
[333] And that's the problem.
[334] So once the stuff comes down, it's really hard to go back and piece it together.
[335] So I can see that there are, per your point, some of the really, really just fucking horrible troll accounts that they ran.
[336] They didn't necessarily have a lot of followers, but you see them in there like adding people.
[337] So they're, you know, at, and then the name of a reporter; at, the name of a prominent person.
[338] And so they're in there kind of, like, drafting on the popularity of, you know, famous people, basically.
[339] And they're just saying like horrible shit.
[340] And it's, the tone is so spot on.
[341] And one thing that was interesting with a couple of them is, like, if you go and you look at their profile information, which was also made public, they would have, like, a Gab account in their profile.
[342] That was, like... so it was a remarkable
[343] piece of, kind of, the culture, in which you see that they're actually sitting on Gab too, right? And so they can also go and they can draw on there. On Reddit, there's, you know, 900 or something troll accounts that were found. They're on Tumblr. And so they're just picking the most divisive content and they're pushing it out into communities. And at the same time, we can see that they're doing it, but we can't see what people do in return. We can't say, did they just block? Did they fight back? Was there a huge, you know... when this happens on a Facebook page, and they're doing something like telling black people not to vote.
[344] As black people, we shouldn't vote.
[345] What do people say in response?
[346] And that's the piece that we don't have.
[347] So when we talk about impact, a lot of the impact conversation is really focused on, did this swing the election?
[348] We don't have, nothing that I've seen has the answer to that question.
[349] But the second question, when I think about impact, and I think you and I agree on this, it also matters: how does this change how people relate to each other?
[350] And we have no real evidence of, you know, we have no information on that either.
[351] This is the kind of thing that lives in some, you know, Facebook has it.
[352] The rest of us haven't seen it.
[353] Now, are most of these people, is this mostly Facebook?
[354] Is it mostly Twitter?
[355] Where does, how does it break down?
[356] Yeah.
[357] So there were, here are my like little stats here because I don't want to give you the wrong data.
[358] There were 10.5 million tweets, of which about 6 million were original content, created by about 3,800 accounts.
[359] There were about 133, I mean, just to read it, 133 Instagram accounts with about 116,000 posts, and then 81 Facebook pages and 17 YouTube channels with about 1,100 videos.
[360] And so they got about 200 million engagements on Instagram and about another 75
[361] million or so on Facebook.
[362] Engagements are, like, likes, shares, comments, reactions, you know.
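Just to put those figures in proportion, here's a quick back-of-envelope pass using only the numbers read out above; these are simple averages, which of course hide the long tail of hits and flops she describes:

```python
# Back-of-envelope scale check using only the figures quoted above.
tweets_total = 10_500_000        # ~10.5 million tweets
tweets_original = 6_000_000      # ~6 million original
twitter_accounts = 3_800

instagram_posts = 116_000        # across 133 accounts
instagram_engagements = 200_000_000
facebook_engagements = 75_000_000

print(f"{tweets_total / twitter_accounts:,.0f} tweets per account")        # ~2,763
print(f"{tweets_original / tweets_total:.0%} original content")            # ~57%
print(f"{instagram_engagements / instagram_posts:,.0f} engagements/post")  # ~1,724
print(f"{(instagram_engagements + facebook_engagements) / 1e6:.0f}M combined")  # ~275M
```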
[363] So it's hard to contextualize, but what we think happened, you know, you can go and you can try to look at how well did this content perform relative to other real authentic media targeting these communities.
[364] And what you see with the black community in particular is their Instagram game was really good.
[365] So on their Instagram accounts, of the top five, three of them targeted the black community and got, you know, tens of millions to 100 million engagements.
[366] So I would have to pull up the exact number.
[367] Is it mostly memes?
[368] I don't know it off the top of my head.
[369] Yeah, it's on Instagram, it's all memes.
[370] And then, you know, so we have the memes and then we have the text.
[371] On Instagram, you can't really share.
[372] So it's amazing that they got the kind of engagement that they did, even without the sharing function.
[373] One of the things you can do is if you know the names of the accounts and a lot of them are out there publicly now, you can actually see them in regram apps.
[374] So people were regramming the content.
[375] So Facebook says about 20 million people, excuse me, engaged with the Instagram content.
[376] But what isn't included in that is all of the regrams of the content that were shared by other accounts.
[377] So the spread and the dispersion of this, It's an interesting thing to try to quantify because we have engagement data, but we don't know did it change hearts and minds.
[378] We don't know if it influenced people to go follow other accounts.
[379] We don't know if it influenced people to not vote.
[380] There's just so much more, I think, still to understand about how these operations work.
[381] Well, we can kind of, we can assume that it had some impact, right?
[382] I mean, as you were saying earlier, when a new person enters into a conversation, it changes the tone of it.
[383] How much of what they did was their own original post and how much of it was commenting on other people's posts?
[384] So I thought you were actually going to ask a different thing there.
[385] Oh, please.
[386] What did you think I was going to ask?
[387] How much of it was them repurposing our own posts, right?
[388] Repurposing real American content.
[389] Did they do that as well?
[390] Yeah, tons of times.
[391] But let me, let me, so they created a lot of their own stuff, particularly in the early days.
[392] And so you can actually read the dataset.
[393] And one of the things, when we started finding these posts, I was struck by how sometimes it read like ESL and then sometimes it read like perfect, flawless, professional English, and then other times it read like normal English, vernacular, just the way that we would talk to each other.
[394] And I started digging into what that was.
[395] So when it was vernacular English, when it was, when it read like fluent American, American English, it was usually cribbed from somewhere else.
[396] So they would go and they'd find a local news story from some obscure local paper, and they would crib, and then they would paste that. And so the Facebook post would be that cribbed sentence from that article, and then their meme, and maybe they would add a sentence underneath it to give it some kind of context or angle. When they would write their own stuff, you would see the sloppiness. That's where you could see subject-verb agreement not quite there, the, you know, ways in which, like, Russian possessives are different than American possessives, the slips there.
[397] And then the other thing was the really funny stuff, which was, you know, a post that's supposedly written by a Texas secessionist, right?
[398] So you can probably have an image of a Texas secessionist in your mind as I say this.
[399] And it would be things like Hillary Clinton is a terrible individual.
[400] And as a terrible individual, it's completely impossible for us to back her and her candidacy for the American presidency.
[401] And they'd say, furthermore, you know?
[402] It's like, furthermore, ergo.
[403] It is clear that.
[404] And I'm like, it reads like, remember you're like in, you know, English in college or something.
[405] You've got to, like, write a formal essay.
[406] I was like, okay, come on.
[407] You bullshit your way through it.
[408] Right.
[409] So nobody actually talks like this, especially not, you know, your stereotypical Texas secessionist.
[410] So it was funny seeing these incongruities.
[411] And that's, unfortunately, one of the best ways to tell what you're dealing with: to actually kind of look for those incongruities now,
[412] and see, uh, as you read communications online, like, you know, does this read like an American?
[413] Does this read like a communication?
[414] And what we started to see was one way to not get caught for your lousy English or your, you know, your cultural, um, lack of, uh, kind of native abilities, is to just repurpose other people's stuff.
[415] And so that's where you would see memes getting shared from, on both the right and the left. You know, you'd see a lot of these, like, Turning Point USA memes that they were repurposing and pushing out, or you would see Occupy Democrats or The Other 98%.
[416] So memes from real American pages, real American culture, and they would just sometimes slap a new logo on and just repost it as if it was theirs.
[417] So it does, in those instances, read just like, you know, authentic American content.
[418] And in many ways, it is authentic American content.
[419] How many people are working for this agency?
[420] Do we understand?
[421] I don't remember.
[422] Off the top of my head, it was somewhere between a couple hundred and a thousand, I think.
[423] I don't know if it's bigger than that now.
[424] And they're just constantly...
[425] They just moved offices.
[426] And this is a funny story.
[427] I guess they moved offices.
[428] And then people started calling in bomb threats to the
[429] office, and it was just like, every day a new bomb threat would get called in, so they couldn't work, basically.
[430] I assume this is, like, some American intelligence agency just, like, fucking with them. But so there's people calling in these bomb threats to try to keep them from working. And I think there was an article that came out really recently that said that, like, Army Cyber, one of our agencies, worked to just, like, take them offline during the midterms, a couple days around the midterms.
[431] I wonder if whoever's calling it in is doing it in bad Russian.
[432] Yeah.
[433] That'd be so ironic.
[434] So that was really funny, like, they moved to this nice new office building and someone chucked a Molotov cocktail through the window at some point.
[435] Of course.
[436] Yeah, of course.
[437] It's spy versus spy.
[438] It's, I mean, it only makes sense that in this bizarre and unpredictable and really unprecedented environment that we find ourselves in, that something like this would come up and just sort of throw a monkey wrench into the gears of real conversation
[439] online.
[440] I mean, it's, it's a really amazing time in that we're getting to see this kind of stuff happen in real time.
[441] We're getting to see these, these sort of weird attempts at manipulating things.
[442] And I think in a lot of ways successful, especially with less sophisticated people that don't really understand that they're being trolled and that someone is fucking with them.
[443] And there's, it seems, I mean, I've, there's a bunch of accounts that I have bookmarked that I follow, but I don't follow.
[444] So I don't follow them online because I don't want them to know I'm following them, but I just go to them.
[445] And some of them are so strange.
[446] A few of them are flat earth accounts.
[447] This is something that I'm finding.
[448] Yeah, yeah, yeah, the conspiracy theorists.
[449] Yes.
[450] And some of them literally have almost no, it's all memes, and they don't say much, if anything, underneath the memes.
[451] And I go to it, I'm like, what exactly are they doing here?
[452] Like, what exactly are
[453] they trying to do with these?
[454] Because they just, they're very weird.
[455] There was one that I came across.
[456] I was looking at the, uh, the conversation around GMOs.
[457] And because we have seen... one of the things that Russia does, besides the social bots and, you know, screwing with Americans directly, um... the House, so this was a Republican House committee, the House Science and Technology Committee, about a year ago said that they were seeing evidence of both kind of overt propaganda and then ways of disseminating the propaganda.
[458] So there's always the dissemination and then the accounts and then the content.
[459] So it's like you look at three different things to try to get a handle on whether or not this is real or fake.
[460] So when we talk about the accounts, we're looking at are they real people?
[461] Are they, you know, automated or are they not automated?
[462] When you're looking at the content, you're usually looking at the domains.
[463] And that's kind of the last piece, because you don't want to have any kind of bias
[464] get in there, but you're just trying to see, is it being pushed through, like, overt Russian propaganda domains, like their think tanks and things?
[465] And then the third is the dissemination pattern.
[466] Is it being pushed out through automated accounts?
[467] Is it spreading in ways that look anomalous versus how normal information would spread?
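To make that three-signal rubric concrete, here's a minimal sketch of how the accounts, content, and dissemination checks could combine. The domain list, field names, and thresholds are placeholders of my own for illustration, not anything from the actual House analysis:

```python
from dataclasses import dataclass

# Minimal sketch of the three-signal rubric described above: the accounts
# (automated or not), the content (which domains it links to), and the
# dissemination pattern (does it spread anomalously). All values here are
# hypothetical placeholders.
SUSPECT_DOMAINS = {"example-think-tank.ru"}  # hypothetical propaganda domain

@dataclass
class Post:
    account_is_automated: bool    # e.g. posting cadence too regular for a human
    linked_domain: str
    spread_anomaly_score: float   # 0..1, versus how normal information spreads

def rubric_flags(post: Post) -> list:
    flags = []
    if post.account_is_automated:
        flags.append("accounts: automated behavior")
    if post.linked_domain in SUSPECT_DOMAINS:
        flags.append("content: overt propaganda domain")
    if post.spread_anomaly_score > 0.8:  # arbitrary threshold for the sketch
        flags.append("dissemination: anomalous spread")
    return flags

# All three signals firing together is much stronger evidence than any one alone.
print(rubric_flags(Post(True, "example-think-tank.ru", 0.9)))
```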
[468] So one of the things that the House committee looked at, using that kind of rubric, was Russian... you know, these dubious pieces of content and narratives around American strategic industries.
[469] So the energy industry, oil and fracking, for example. Or, you see a lot of stuff with GMOs and agriculture.
[470] You know, this narrative of, you know, Putin and Russia being the land of organic plenty and the United States serving its people toxic, poisoned vegetables, this sort of stuff.
[471] And meanwhile, at the same time, there's competition for who's going to get the, you know, large contract to provide rice to some part of the world.
[472] So there's like an economic motivation underlying this kind of narrative.
[473] And I was looking at one of these accounts and it was tweeting an article about Hillary Clinton, a vote for Hillary is a vote for Monsanto.
[474] But it was tweeting this in, you know, it was like three months ago or something.
[475] It was like mid-2018 or late 2018 when I was looking at this.
[476] I'm like, well, that ship sailed a long time ago, guys.
[477] Why are we tweeting about Hillary's votes?
[478] It's because they're just, they're just there to, it was written by a Russian think tank.
[479] And so they're, they just have these automated accounts retweeting, repurposing this content from forever ago.
[480] And it doesn't even make sense.
[481] It's just out there to amplify a particular point of view or bump up, uh, mentions of a site.
[482] What were they trying to do with the anti-vaccine posts?
[483] Yeah, that was a, that was an interesting thing.
[484] So I would say not much, to be honest.
[485] Um, there were 900, maybe 800, I think, tweets about vaccines in the content. And so Facebook and Twitter... sorry, Facebook and Instagram, you have this building up of tribes. Twitter, instead, they're just talking about whatever's popular, right? There's shitposting, they're talking about whatever's current and new, whatever scandal has just broken anywhere.
[486] So Twitter is less about establishing relationships and more about joining the conversation and nudging it.
[487] And so most of the vaccine -related posts, it was not a big theme for them.
[488] It wasn't something that was on, like, Facebook and Instagram, where they're really leaning in.
[489] Like, this is what we want Americans to think about.
[490] So no mention of vaccines on those platforms, not on YouTube.
[491] On Twitter, you see it in 2015.
[492] Funny enough, during the Disneyland measles outbreak... much like there's a whole lot of conversation around vaccines right now because of the outbreaks in Washington and New York.
[493] Back in 2015, you saw the same thing.
[494] Lots of conversations about measles because of the Disneyland thing that happened down here.
[495] And so they're in there and they're saying, vaccinate your kids, don't vaccinate your kids.
[496] They had a couple of conspiracy theorist accounts.
[497] I am trying to remember the name.
[498] It looked like a blonde woman.
[499] I think its name was Amy.
[500] And Amy was a conspiracy theorist. And Amy was a fake person? Amy was a fake person. Yeah... I wish I could remember. Yeah, it was a Twitter account. God, what the hell was her name?
[501] She was Amy Black. Certain of their personas actually got a lot of lift. There was one called Woke Louisa, that was a black woman. There was... yeah, I mean, they nail it, right?
[502] They're not dumb. There was TEN_GOP, the fake Tennessee GOP page. How much autonomy do you think these people have that are creating these things?
[503] I mean, are they creative?
[504] It sounds like some of them are actually pretty funny.
[505] Yeah, they are funny.
[506] They are funny.
[507] That's why they work.
[508] Everybody thinks it's just like, you know, incompetent shit.
[509] It's not.
[510] It's actually really good.
[511] That's, I think, the thing that, you know... whatever political proclivities you may have, I think you can at least recognize humor, even if it's laughing at your side.
[512] And I will say that some of the stuff, especially targeting the right wing, you know, the right wing like youth kind of pages were, they were funny.
[513] They really were.
[514] And it was... I think that is... people assume that, like, they're too smart to fall for it.
[515] It's just those liberals or it's just those conservatives or, you know, it's really, it targets everybody.
[516] And they understand the psychology, the motivation, the narratives, and the culture, and they produce the content accordingly. And I imagine that they had a grand old time doing it, because there was some stuff in two narratives that came out in 2017. The first was when Facebook started to moderate their pages: they started to scream about how Facebook was censoring them. So the exact same narrative that you see today about how any moderation is censorship. It's a picture of, like, Zuckerberg, and it's like, nice page you've got, be a shame if anything happened to it, you know. And that's the meme that they're putting out there when they're complaining that their fake page got taken down. There were tons and tons of these memes also about "the Russians did it," mocking the idea that the Russians did it. So this is as the story is beginning to come out, before we've had the tech hearings, before we've had the Mueller indictments, before we've had the investigation. You see these memes where it's like, oh, my speedometer was broken, it must have been the Russians. Or a picture of Hillary Clinton, and it's like in a Little Golden Book kind of thing.
[517] And it's like the whiny child's guide to blaming Russia for your failures.
[518] You know, and it's, again, it's funny.
[519] Like, the stuff is funny.
[520] And they're like metatrolling.
[521] And you imagine them sitting there.
[522] Like, you know, they have a picture of like some like, you know, buff guy carrying a gun.
[523] And they're like, I'm not a Russian troll, man. I'm an American.
[524] Wow.
[525] Okay.
[526] So you're looking at this and you're like, it's just so...
[527] oh, spot on.
[528] And again, I can't see what the people commented under it if they were like right on or if they were like, ah, this is bullshit.
[529] But so that's where you get at the... you know, people think, like, oh, I'm too smart to fall for it, or, oh, this is targeting those other people.
[530] No, it isn't.
[531] That's the problem.
[532] It's just, it's going to target you with the thing that you're most likely to be receptive to just because of psychological bias and tribal affiliation.
[533] And you're not sitting there thinking: is this person, who is purportedly just like me, screwing with me?
[534] And that's why it does manage to attract a following and get retweeted, get reshared.
[535] It's good.
[536] Well, it's so clever because it's so comprehensive.
[537] There's so much involved in the fact that they're willing to do this for years and years before they really sort of activate the political aspect of what they're trying to do.
[538] It also, it's strange that they're so sophisticated about our culture because we don't know a goddamn thing about Russia.
[539] The average person knows Putin bad, evil warlord, Crimea, he invade.
[540] You know, like, we have, like, a four-year-old's understanding.
[541] Like, if you just grabbed a random person on the street, a college-educated person, and asked them to describe what's so bad about Russia?
[542] Wow, it's like communist over there or something, man. I mean, they fucking, they hate us.
[543] First of all, they have bombs.
[544] Like, there's so little understanding of their culture.
[545] But yet, they know so much about ours.
[546] That's one of the weird things about being an American.
[547] When you travel overseas, you realize how much they know about our elections, how much they know.
[548] We don't even know who the fuck is running their country.
[549] We don't have any idea.
[550] But they know about Trump and they know that Hillary did this and they know that Bernie wants to give the money away.
[551] And you know, it's crazy.
[552] It's weird.
[553] And these people must have like a deep education on American culture, American politics.
[554] Do you think they're training these kids?
[555] Yeah, yeah, absolutely.
[556] So they did a couple of things that came out in the Mueller indictment.
[557] First of all, a couple of people actually came here and did a road trip around America.
[558] Oh, wow, just to learn.
[559] Went to Texas.
[560] Yeah, yeah, basically.
[561] A little Texas.
[562] Tell me where you keep the beef jerky.
[563] But that was in, I think, the Mueller indictment from February 2018.
[564] There have been three kind of big documents that have come out, two from the Eastern District and one from Mueller, on how it all worked.
[565] I think another misconception is this notion of $100,000 in ads.
[566] They spent $18 million in 2017, I believe, was the stat that came out during another one of the Mueller indictments.
[567] So they're not just, you know, the money is not just going for the salaries and the ad buys.
[568] The money is also going toward, they were talking about, using consultants.
[569] And this is where you get at this thing that comes out during the stand-up, where they're like, black people who are LGBT don't want to see white LGBT memes.
[570] And this degree of granularity, the degree of sophistication, but then also what you see them doing is engaging one -on -one.
[571] And that's where it crosses the line from social media operation to this is much more like spying.
[572] Do you watch The Americans?
[573] No, I didn't.
[574] Oh, I love that show.
[575] You should definitely watch it.
[576] It's great.
[577] It's great.
[578] There's too many things to watch.
[579] Totally.
[580] But what's interesting is it does paint a pretty interesting picture of this couple under, you know, deep cover, that are engaging with and pretending to be Americans and forming relationships.
[581] And apparently it's based loosely on real people.
[582] Yeah, that's what I've read also.
[583] But what you see in the Mueller indictment is the text messages, is the messenger, the Facebook messenger messages, where they're going back and forth with real activists.
[584] And they're saying things like, you know, hey, my ad account got shut down.
[585] can you run some ads for me?
[586] Or the, hey, I want to help your protest.
[587] We're fellow Black Lives Matter activists, and we see you're running a protest up in, I think it was, Ithaca or something, Binghamton.
[588] How can we help you?
[589] We can give you some money for posters, and they're sending money for posters.
[590] Or they reach out to a Trump supporter, and they say, like, we think it'd be really funny to have a Hillary Clinton impersonator sitting in a truck, flatbed truck, that's made up to look like a jail.
[591] Let us, you know, if we give you some money, will you find the Hillary Clinton impersonator and put her in jail and do this Hillary for prison thing?
[592] And so this is where, another thing that they did was using Facebook events to create real -world protests.
[593] So they're not limiting it to shit-posting online and making people feel tension online.
[594] They're actually sending people out into the real world to have in-street violence.
[595] And so one of the things that they did was coordinate Facebook events, one for the Texas secessionist page and one for a pro-Muslim page called United Muslims.
[596] And on the same day, at the same time, in Texas, they have a rally to support Texas and resist the Islamization of Texas, across the street from a rally to defend Muslim culture.
[597] And so they just create these Facebook events on these pages and then they promote them with ad dollars and other things.
[598] And literally, if you go and you look at the Texas reporting from that day, I don't remember if it was dozens or hundreds, but a sufficient number of people showed up that you had, on opposite sides of barricades with police officers in the center, people screaming at each other, because one group is there to resist the Islamization of Texas and the other group is there to defend Muslim culture.
[599] And so you get two, you know, two opposite sides of the spectrum in the same place at the same time.
[600] And you literally incite like a mini riot.
[601] So there were about 81 of these events where they were holding Black Lives Matter style rallies for victims of violence, police violence, memorials for people who were killed by police officers, things that real Americans would do, but this wasn't being done by real Americans.
[602] And that's the insidious thing, right?
[603] How does Facebook detect that?
[604] How do you know? When you see this "come defend Texas culture" and you're a diehard, proud Texan, you're not thinking somebody in St. Petersburg is organizing this.
[605] And I think the idea that this was just some memes doesn't respect the significance of what they were trying to do and how effective they were.
[606] And with these other things, even if they're just trying it out a little bit, just working to see what works, they're always experimenting, they're always trying to find ways to create that tension.
[607] And that's the thing that I think is so interesting about this, right?
[608] This evolving, this idea of an information war where these tactics evolve, and you are really at a disadvantage when it comes to actually detecting them.
[609] Yeah, and on the outside, if you're looking at that, you'd say, well, okay, what is their objective?
[610] Why would they have this Texas secessionist rally across the street from a pro-Islam rally?
[611] Why would they do that?
[612] You know, if you're on the outside, you think about the amount of effort that's involved in doing something like this.
[613] And they're also doing this with no leaders, right?
[614] There's no one there that's running it when they get there.
[615] So all the pro -Texas people go, here we are.
[616] Look it over there.
[617] It's a motherfucking enemy.
[618] I think a couple times there were comments on some of the archived pages and things, where you could see the screenshots of people being like, dude, you had us all come out there and, like, nobody showed up, right? Who was in charge? You know, they're probably throwing a lot of things against the wall hoping that they stick. And when something sticks... well, you see this on the, there was a page called Black Matters. And Black Matters was interesting because they went and made a whole website. So they made a website, blackmattersus.com, which I think is still active. It's dormant.
[619] They're not updating it, but I believe you can go and read it.
[620] And it was designed to call attention to police brutality type things.
[621] And so they had this Black Matters US page.
[622] And then there's the Black Matters Facebook page, the Twitter account, the Instagram page, the YouTube channel, the SoundCloud podcast, the Tumblr, the Facebook stickers.
[623] They had Facebook stickers that looked like little black panthers, like little cats.
[624] Yeah, little black cats.
[625] They were actually really cute, very well done.
[626] So you have this entire fake media ecosystem that they've just created out of whole cloth, all theirs.
[627] And then what they start to do is they start to put up job ads.
[628] And so it's come be a designer for us, come write for us, come photograph our protests. You know, they have, like, a black guy dressed in a cool hipster outfit, holding a sign, like, join Black Matters.
[629] You see them go through a couple different logos the same way you would if you were starting a, you know, media brand.
[630] They start posting ads for, do you want to be a calendar girl? Send us your photos.
[631] Do you want to be on a black reality TV show? Send us video clips of you.
[632] Like, they begin to do real work to ingratiate themselves with the community.
[633] They had a physical fitness thing.
[634] It was called Black Fist, and the idea was that it was kind of vaguely militant-esque, in that it was supposed to teach black people how to handle themselves at protests.
[636] Should there be police violence, how to fight back?
[637] And they actually went and found a guy, a physical fitness, you know, a martial arts guy, and they were paying him via PayPal.
[638] So he was running classes for the black community under this black fist brand.
[639] And they would like text him or call him.
[640] He played some of the voicemails on TV.
[641] Actually, I heard them.
[642] After my report came out, I think they tracked him down.
[643] And he just talks about how they just PayPal'd him a couple hundred bucks every time he ran a fitness class.
[644] What were the voicemails like?
[645] It was, um, hello, we are fellow black men concerned about police.
[646] You'd be surprised. They actually had a YouTube channel with two black men named Williams and Calvin.
[647] And there was this channel, Williams and Calvin.
[648] And there were actual black men?
[649] Actual, yeah, yeah, actual black men.
[650] So they hired these gentlemen?
[651] They hired these guys to run a fake YouTube channel.
[652] And it was called A Word of Truth, I think was the name of it.
[653] And so Williams and Calvin, these two guys, would give their word of truth.
[654] And their word of truth was usually about, you know, how fucked up America is, which, I mean, there are very real grievances underlying all of this.
[655] And that's the problem, right?
[656] They have things to exploit.
[657] Were they writing these things for these gentlemen?
[658] I imagine.
[659] I mean, I imagine they were.
[660] But they were definitely paying them, and they organized the channel.
[661] Seems likely.
[662] The channel's organized, yeah.
[663] So these guys...
[664] Well, that particular one, I think that they were actually...
[665] They were in on it.
[666] They knew what they were doing.
[667] Oh, really?
[668] They knew they were working for the Russians?
[669] One of the guys who was in that channel popped up again in 2018, right before the midterms.
[670] Like, maybe even the day before.
[671] I'm trying to remember the timeline here.
[672] And he made a different video saying he wanted to...
[673] This was amazing.
[674] Saying he wanted to leave the Internet Research Agency kind of.
[675] So he was saying, basically, I'm tired of doing this work, I want to do a leak, I want to show you all of the things that the Internet Research Agency has done.
[676] And so, this guy who had been in the Williams and Calvin videos, whose face people recognized, pops up in the 2018 midterms and says he wants to leave and he's going to leak all this information.
[677] And, sorry, the damn cough. And he wants to, like, confess. I don't remember all the specifics, because it was right before my thing came out and I was so busy working. But yeah, he pops up again and he's saying he, like, wants to expose the truth. And I think most people didn't cover it, didn't pay attention. YouTube shut down the channel and deleted the video immediately. Why did they do that? I think that it was seen as another influence operation.
[678] You know, you don't trust the...
[679] So even him saying that he's going to expose it was probably just another level.
[680] Well, what wound up coming out, this is so convoluted.
[681] I'm sorry, I know it's like hard to explain without visuals.
[682] What they wound up doing was they did drop a bunch of docs.
[683] So they did release a pile of documents in which they claimed they actually hacked the Mueller investigation and Mueller had nothing.
[684] And so this is, again, another kind of convoluted piece of this where they do release information.
[685] And so in this particular example, they release information that we believe they actually got through legal discovery.
[686] So the documents that the investigation provided to one of the indicted Russians were the documents that they then leaked claiming they had hacked the Mueller investigation.
[687] So they're constantly doing these things to generate press, generate attention, create just that degree of, people don't know what's real.
[688] Or they read the headlines that are then released by the more overt Russian propaganda outlets, and they think that that is the true story, that the Russians hacked the Mueller investigation.
[689] So there's always this, how do we create fear, uncertainty, and doubt?
[690] How do we throw people off?
[691] How do we come up with these extremely convoluted spy games that leave people feeling unbalanced, that make people wonder what they can trust, who they can trust, and what's real.
[692] And even as somebody who looks at this stuff day in and day out for years, I do still regularly get surprised by the sheer ballsiness and ingenuity of some of this stuff as it comes to light.
[693] Well, it's really fascinating that they went so far as to hire people to make a fake account on YouTube, hired these black guys to pretend that they're doing it on their own when they're really being hired by the Russians. And then when the guys leave, you don't know if they really did leave. You don't know if this is just more bullshit. It's like, like you were saying earlier, if they get you and you buy into it hook, line, and sinker, they win. If they get you to think, well, how much else is bullshit? They still win, because you're looking at everything with sort of this tainted lens now.
[694] Everything seems tainted. And in a sense, that's probably the ultimate goal: to disrupt our social media environment and to sort of hijack the natural conversations that are taking place.
[695] Yeah, and I think it's, I mean, it's effective.
[696] There's certain... you know, I was in Estonia last year, and they've been targeted by this stuff for decades now.
[697] They have a 25% Russian-speaking population, and most of the news that they get is from Russian media, right on the border.
[698] They talk a lot about the extreme commitment to educating their citizens to make them realize that this kind of thing does happen.
[699] This is what it usually looks like.
[700] Don't share it.
[701] You know, just ignore it.
[702] Let it go by.
[703] And I don't think we are quite there yet.
[704] I think that there's still plenty of people in the country who don't believe it happened.
[705] or for some reason are completely incapable of separating "the Russian social media influence campaign happened" from "it means Donald Trump's election is illegitimate" or "it means Donald Trump colluded," right?
[706] Those are very different statements.
[707] You don't have to collude in order for someone somewhere to, unsolicited, go and support your candidacy.
[708] So you can believe two things simultaneously.
[709] One, that Trump did not collude, that his election is perfectly legitimate, and that this had no impact; and two, that it still happened. And I am consistently amazed, when I read my social media mentions, at how hard it is for people to hold those two ideas. If they're supporters of Trump, they absolutely cannot acknowledge that this operation took place. Or if they are passionate supporters of the far left, it's more of an equivocation: well, we don't really know if they did it.
[710] Well, the U.S. does bad things too.
[711] Well, how do we know?
[712] So that's where it plays out very differently depending on which part of the political spectrum you sit on.
[713] Well, it falls right into the issue that we have with cognitive dissonance.
[714] If we believe in someone, or if we want someone to win, especially if it's our team or our person or on our side... you know, I saw a lot of this when Donna Brazile released her book detailing how the DNC sort of rigged the primaries against Bernie Sanders and for Clinton.
[715] There were so many people that were Clinton supporters that just didn't want to believe it.
[716] I was like, well, why wouldn't you believe this woman?
[717] Like you believed her before when she was supporting Clinton.
[718] And then when she leaves, now you won't believe her.
[719] It's because it's inconvenient.
[720] And we're real weird in our binary view.
[721] We want things to be good or bad.
[722] one or zero.
[723] This is it.
[724] And this is a super complex issue.
[725] It seems like they've been doing this for a long time and they've gotten really sophisticated at it.
[726] And I think there's a lot of people that have been sucked into it that have no idea that it's actually influenced the way they've formed their own opinions.
[727] This is where it gets really strange.
[728] People are so malleable and they're so easily manipulated.
[729] Many people are.
[730] That something like this, like a real good solid concentrated effort to try to target these groups that have these very specific interests and really dig in and form roots and then go out.
[731] I mean, it's so sophisticated.
[732] Their approach, on one hand, I'm horrified by, and on the other hand, deeply impressed.
[733] Yep.
[734] Me too.
[735] Yeah.
[736] Now, was this freaking you out when you had to, like, go over all these memes and you were actually laughing at them?
[737] And you're like, God damn it.
[738] Well, you know, there's that tweet that goes around every now and then: you don't have to hand it to them.
[739] And I'm always like, how do I properly convey a recognition for the... you know, I don't think we do ourselves any favors by pretending it all sucked and didn't matter and they're incompetent.
[740] Right.
[741] I think that you have to acknowledge that you have a sophisticated adversary that is very capable, that is very determined, that is constantly evolving.
[742] and to treat that with the degree of respect it deserves.
[743] I think that that's just common sense, actually.
[744] I read media on both sides of the aisle, and I feel I try to stay current, actually, on what memes are percolating in lots of different spaces, in part just because I am always curious about what's organic versus what seems to be disproportionately amplified or what new communities are popping up.
[745] I just think the spread of information among people is very interesting; it's something that interests me a lot.
[746] I think crowd psychology is really interesting.
[747] I think ways that crowd psychology has transformed as the internet has kind of come into being, particularly with things like the mass consolidation, the ease with which we can target people.
[748] You know, we didn't even really talk about that.
[749] But one of the things is, there's always, even in the decentralized internet, there's always been propaganda.
[750] There's always been crazy conspiracy theories, all this stuff. But it's that you can reach the people who are likely to be receptive to it now.
[751] And as people self -select into tribes, particularly in this country right now, one of the things that's remarkable is the way in which once you've self -selected into that tribe, and this is the media in your ecosystem, and you share it with your friends and Facebook ensures that the people who see it are the people who are most likely to be receptive to it, or if you run the ad targeting, you directly, you know, send it into the feeds of people most likely to be receptive to it.
[752] We have this interesting phenomenon where consolidation, targeting, and then these gameable algorithms mean that this kind of information goes way farther, way faster than it ever could in the past, regardless of whether it's Russia pushing it or Iran, as we've seen with a network of Iranian pages that went down recently.
[753] We see this globally now.
[754] We see countries targeting their own people with it.
[755] And it's just, this is the information ecosystem.
[756] This is like the new infrastructure for speech.
[757] And it, sorry, privileges this kind of sensationalist content.
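To make the "gameable algorithms" point concrete, here is a minimal sketch of an engagement-weighted ranker. It is purely illustrative, not any platform's actual algorithm; the Post fields and the weights are invented for the example. It just shows why optimizing for raw engagement, with shares and comments weighted most heavily, tends to float the sensational to the top, and why that objective is easy for small coordinated groups to game.

```python
# A minimal, hypothetical engagement-weighted ranker -- not any platform's
# real algorithm. Fields and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Shares and comments spread a post to new audiences, so they are
    # weighted above likes; outrage reliably drives both.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Whatever provokes the strongest reaction floats to the top,
    # regardless of whether it is true, organic, or coordinated.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("Local library extends weekend hours", likes=40, shares=1, comments=2),
    Post("THEY don't want you to see this!!!", likes=25, shares=30, comments=60),
]
for post in rank_feed(feed):
    print(engagement_score(post), post.text)  # the sensational post ranks first
```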
[758] Yeah, do you have them?
[759] It would be great.
[760] Just, just remember to have them.
[761] Cold's going around.
[762] Don't feel bad.
[763] I'll get you one.
[764] Hold on a second.
[765] Who's seen all this stuff?
[766] Obviously Facebook has.
Check this out.
[769] I'm sure Twitter's aware.
[770] What has the reaction been?
[771] And is there any sort of a concerted effort to mitigate some of the impact that these sites have?
[772] Yeah, lots of it actually.
[773] So I think 2017 was when we, we being independent researchers, I guess, people on the outside of the companies, academics, began to find the content, really began to investigate. Journalists would identify the name of a page, and then me and people like me would go and scour the internet looking for evidence of what was on that page.
[775] So I found a bunch of the stuff on Pinterest, for example, wrote about it.
[776] A guy by the name of Jonathan Albright found a CrowdTangle data cache.
[777] And with that we got the names of a bunch more pages, bunch more posts, and we had some really interesting stuff to work with.
[778] Originally, the platforms were very resistant to the idea that this had happened.
[779] And so as a result of that, there was... the first thing, in 2016, when Trump gets elected: Twitter goes crazy that night, with people who work at Twitter saying, oh my God, were we responsible for this? Which is a very Silicon Valley thing to say.
[780] But what I think they meant by that was their platform had been implicated as hosting Russian bots and fake news and harassment mobs and a number of other things.
[781] And there was always the sense that it didn't have an impact and it didn't matter.
[782] And so this was the first time that they started to ask the question, did it matter?
[783] And then Zuck made that statement.
[784] Fake news is a very small percentage of whatever on Facebook, the amount of information on Facebook.
[785] And the idea that it could have swung an election was ludicrous.
[786] So you have the platforms kind of the leaders of the platforms digging in and saying it's inconceivable that this could have happened.
[787] And as the research and the discovery begins to take place over the next nine months or so, you get to when the tech hearings happen.
[788] So I worked with a guy by the name of Tristan Harris.
[789] He's the one who introduced me to Sam.
[790] And he and I started going to D.C. with a third fellow, Roger McNamee, and saying, hey, there's this body of evidence that's coming out here.
[791] And we need to have a hearing.
[792] We need to have Congress ask the tech companies to account for what happened, to tell the American people what happened.
[793] Because what we're seeing here as outside researchers, what investigative journalists are writing, the things that we're finding, just don't line up with the statements that nothing happened and this was all no big deal.
[794] And so we start asking for these hearings, and actually myself and a couple of others then begin asking, in the course of these hearings, can you get them to give you the data?
[795] Because the platforms hadn't given the data.
[796] So it was that lobbying by concerned citizens and journalists and researchers saying we have to have some accountability here.
[797] We have to have the platforms account for what happened.
[798] They have to tell people because this had become such a politically divisive issue.
[799] Did it even happen?
[800] And we felt like having them actually sit there in front of Congress and account for it would be the first step towards moving forward in a way, but also towards changing the minds of the public and making them realize that what happened on social platforms matters.
[801] And it was really interesting to be part of that as it played out, because one of the things that Senator Blumenthal, one of the senators, did was actually say Facebook and Twitter have to notify people who engaged with this content.
[802] And so there was this idea that if you are engaging with propagandist content, you should have the right to know.
[803] And so they started to push messages. Twitter sent out these emails to all these people saying you engaged with this Russian troll, and Facebook created a little field, a little page, that told people if they had liked or followed a troll page.
[805] So it was really trying to get at making the platforms accountable.
[806] But they did it outside the platform, through email, huh?
[807] Which is interesting because I would never read an email that Twitter sends me, right?
[808] You're like, this has just got to be nonsense.
[809] I didn't get one, so maybe, I guess, I just got lucky, but...
[813] I might have had a multiple-day back and forth with some Russian troll.
[814] But that was, I think, one of the first steps towards saying, like, how do we make the platforms accountable?
[815] Because the idea that platforms should be accountable was not a thing that everybody agreed on in 2015 when we were having this conversation about ISIS.
[816] And that's where there's the through line here, which is, and it does connect into some of the speech issues too, what kind of monitoring and moderation do you want the platforms to do?
[818] And when we were having this conversation about ISIS, there was a not insignificant collection of voices that were really concerned that if we moderated ISIS trolls on Twitter, not the beheading videos, there was sort of universal agreement that the beheading videos should come down.
[819] But if we took out what were called the ISIS fanboys, which were like 30,000 to 40,000 accounts at their peak... yeah, there's a document called the ISIS Twitter Census, for anyone who wants to actually see the research done on understanding the Twitter network in 2015.
[820] There was a sense that like one man's terrorist was another man's freedom fighter.
[821] And if we took down ISIS fanboys, were we stifling their freedom of speech, freedom of expression and like, goodness, what would come next?
[822] And when you look at that fundamental swing that has happened now in 2018, 2019, there's that same narrative, because originally no moderation was taking place.
[823] And then now there's a feeling that it's kind of swung too far in the other direction.
[824] But the original conversations were really, how do we make Twitter take responsibility for this?
[825] And legally, they aren't responsible for it, right?
[826] They are legally indemnified; they're not responsible for any of the content on their platforms.
[827] None of the platforms are.
[828] There's a law called Communications Decency Act Section 230, and that says that they're not responsible.
[829] They have the right to moderate, but not the obligation to moderate because they are indemnified from responsibility.
[830] So the question becomes now that we know that these platforms are used for these kinds of harms and they are used for this kind of interference, where is that balance?
[831] What do we want them responsible for monitoring and moderating?
[832] And how do we recognize that that is occasionally going to lead to incorrect attributions, people losing accounts, and things like that? So, yeah, they're in a weird conundrum right now. They're trying to keep everything safe, and they want to encourage people to communicate on the platform, so they want to keep people from harassing folks. But because of that, they've also got these algorithms, and the algorithms tend to miss very often, like this whole Learn to Code fiasco, where people are getting banned for life for saying learn to code, which is about as preposterous as it gets. I think the Learn to Code fiasco is going to be the tipping point, the thing a lot of people in the future look back on when they ask, when did the heavy-handedness become overreach? Learn to Code. Because, I mean, Jesus Christ, if you can't say learn to code... I mean, I look at my mentions on any given day. Especially, like, yesterday, I had a vaccine proponent on.
[833] Yeah, I watched it.
[834] Peter Hotez.
[835] Yeah, Peter's a great doctor.
[836] And what seemed really disturbing to me was that the vast majority of the comments were about vaccines, and so few were about these unchecked diseases that are running rampant in poor communities, which was the most disturbing aspect of the conversation to me. There are diseases that rob you of your intellectual capacity, that are extremely common, that as many as 10% of people in these poor neighborhoods have, and almost no discussion. It was all just insults: you fucking shill, and this and that. It's like, my mentions are going to be interesting. Oh, they're going to be a disaster today, I know, I know. Well, let me... I think that one of the challenges for the platforms is that a lot of things start out like Learn to Code.
[837] I remember, you know, I watched that play out.
[838] Covington Catholic was another thing that, I mean, God.
[839] With Learn to Code, there were some of the people who were trolling and just saying learn to code, and, you know, whatever, you don't have a right to not be offended.
[840] But then there were the other accounts that kind of took it that step further and began to throw in, like, the ovens and the other stuff alongside Learn to Code, right?
[841] And that's one of the challenges with the platform, which is, if you're trying to assess just the content itself, like if you start doing keyword bans, you're going to catch a lot of shit that you don't want to catch. But the flip side, you know, this is the challenge of moderating at scale, which is, what side do you come down on? Do you come down on saying, like, 75% of people with hashtag Learn to Code are just not doing anything incredibly offensive, and then the 25% who are really change the tone of the overall campaign and the hashtag for the entire community? And that's where you see Twitter, I think, come in with the more heavy-handed, just-shut-it-down kind of thing. I don't know that there's an easy answer. I think that we are, you know, even today, what was the latest kerfuffle? Elizabeth Warren got an ad taken down on Facebook, and then there was a whole conversation about whether Facebook was censoring Elizabeth Warren.
[842] I personally didn't think that it read like censorship.
[843] What was the ad about?
[844] It was an ad about, funny enough, her platform to break up Facebook.
[845] Wow.
[846] So Facebook took that down?
[847] Like, yeah, listen, Hooker.
[848] It sort of read more like a cell phone.
[849] She had a picture of Facebook's logo in the image and that violates the ads terms of service.
[850] And the reason behind that is actually because Facebook doesn't want, people putting up ads that have the Facebook logo in it because that's how you scam people, right?
[851] That's a great way to rip people off.
[852] So it was probably just an automated takedown, like it halts the ad.
[853] You have to go and make some changes and then you can push the ad back out again.
[854] But it just happens at a time when there's so little assumption of good faith, such extreme anger and polarization, and, you know, an assumption that the platforms are censoring with every little moderation snafu, that I don't know how we have the conversation in a way that's healthy and looks towards solutions, as opposed to the left screaming that it's censored, the right screaming that it's censored, and the platforms trying to figure out how to both moderate and not moderate, which is a tough position to be in.
[856] I think, yeah, I don't have any good answers.
[857] No one does.
[858] That's part of the issue.
[859] And Vijaya discussed that pretty much in depth, what she was saying.
[860] This is about moderating at scale, when you're talking about millions and millions and millions of posts and a couple thousand people working for the organization.
[861] And then algorithms and machine learning that are trying to keep up.
[862] And that's where things like Learn to Code happen.
[863] And people are so outraged and pissed off because when they do get banned, they feel like they've been targeted.
[864] Well, you really just ran into some code.
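As a toy illustration of why that kind of automated, keyword-level moderation misfires at scale, here is a minimal sketch. It is not any platform's real moderation pipeline; the banned-phrase list and example posts are invented. A rule keyed on the phrase alone cannot tell targeted harassment from an innocuous use of the same words, which is exactly how a "learn to code" rule ends up catching people it shouldn't.

```python
# Toy keyword filter -- a sketch of naive phrase matching, not a real
# moderation system. The banned list and the examples are invented.
BANNED_PHRASES = {"learn to code"}

def naive_filter(post: str) -> bool:
    """Flag a post if it contains any banned phrase, ignoring all context."""
    text = post.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

posts = [
    "Lost your job? Learn to code, loser.",                   # harassment
    "I want to learn to code this summer, any course tips?",  # innocent
]
for post in posts:
    print(naive_filter(post), "->", post)
# Both posts get flagged -- the harassing one and the innocent one alike.
```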
[865] And then it's really hard to get someone to pay attention to your appeal, because there's not enough people looking at these appeals, and there's probably millions of appeals every day. It's almost impossible. Yeah, and depending on which side you're on, you also hear, this person is harassing me, and I'm demanding moderation, and nobody's doing anything about it. So it's definitely, I think, gotten worse. It's interesting to look back at 2016 and wonder how much of where we are now is in part because not a whole lot happened in 2016.
[866] In 2016, it was, or 2015 in particular, very light, like almost no moderation, just kind of let it all hang out there.
[867] And I look at it now, particularly as it evolves into this conversation about free speech, public squares, and what the new kind of infrastructure for speech is, what rights we should expect on it. It's really tough. I think some of it is almost, the people who hear the words free speech and just assume that it's people asking for a carte blanche right to harass. How do we balance that? I think Jack and Vijaya were saying this on your show: how do we maximize the number of people who are involved, make sure that all voices do get heard, without being unnecessarily heavy-handed in moderating thought or content, and instead moderate behavior.
[871] And instead moderate particular types of signatures of things that are inauthentic or things that are coordinated.
[872] And this again gets to disinformation too: rather than trying to police disinformation by looking at content, really looking instead at actions and behavior and account authenticity and dissemination patterns.
[873] Because a lot of the worst trolls and stuff are just using these throwaway accounts and then they disappear.
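For a sense of what looking at dissemination patterns rather than content can mean in practice, here is a simplified sketch of one such behavioral signal: bursts of many accounts posting identical text within a narrow time window. The thresholds and the function are invented for illustration; real detection systems combine many such signals with account-authenticity checks.

```python
# Simplified coordination signal: many accounts posting identical text
# within a short window. Thresholds are invented for illustration only.
from collections import defaultdict

WINDOW_SECONDS = 60    # assumed: how close in time posts must land
MIN_CLUSTER_SIZE = 5   # assumed: how many accounts look suspicious

def coordinated_clusters(posts):
    """posts: iterable of (account_id, unix_timestamp, text) tuples."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text.strip().lower()].append((ts, account))
    clusters = []
    for text, hits in by_text.items():
        hits.sort()  # time order
        # Slide over the time-sorted hits looking for dense bursts.
        for i in range(len(hits)):
            burst = {acct for ts, acct in hits
                     if 0 <= ts - hits[i][0] <= WINDOW_SECONDS}
            if len(burst) >= MIN_CLUSTER_SIZE:
                clusters.append((text, sorted(burst)))
                break
    return clusters

# Six accounts pushing the same line within seconds get flagged;
# one organic post does not.
posts = [(f"acct{i}", 1_000_000 + i, "They don't want you to see THIS")
         for i in range(6)] + [("organic_user", 1_000_500, "what a nice day")]
print(coordinated_clusters(posts))
```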
[874] Well, I have the impression myself that when we're talking about censorship, about moderating content, we're really talking about this current era, and that what's coming, in terms of the more invasive, more potent levels of technology on the way, is essentially, we're putting up a small twig fence against a herd of stampeding buffaloes. I feel like everything is moving in a very specific direction, and that very specific direction is fewer boundaries between people and information, and that includes communication.
[875] And it's going to be insanely difficult or nearly impossible to moderate in 10 years.
[876] I just don't think that's going to be, I don't think it's going to be in the wheelhouse.
[877] I just, I think it's going to, we're entering into some weird place where we're either going to have to stay off of social media because it's just too toxic or grow a thick skin and just be able to deal with anything.
[878] And then if that's the case, how are we going to be able to differentiate the things that are specifically designed to manipulate us and change our opinions by foreign entities, like this Russian troll farm?
[880] I do think, when I think about it: we believe that disinformation is in part facilitated by gameable algorithms, consolidation, and then the targeting, the kinds of things we've talked about through this conversation. And I think that the algorithm piece, the manipulatable algorithms, that's really squarely the responsibility of the platforms.
[881] I don't think that there's any regulation or, you know, any kind of framework that's going to come from Congress that's going to address that.
[882] Well, that's pretty clear from the Facebook hearings, right?
[883] I mean, they barely understood the difference between an Android phone and an iPhone.
[884] They really don't know what's going on.
[885] Tim Apple.
[886] Yeah.
[887] That's the king of the world.
[888] Right.
[889] He says he did that on purpose, which is even more hilarious.
[890] Just say you fucked up, man. I mean, to say that you knew Tim Cook, his real name wasn't Tim Apple... It is funny he called him Tim Apple, though. I did appreciate the rest of it, like every CEO in Silicon Valley changing their Twitter handle afterwards. Yeah, that was funny. But I think the... I just lost my train of thought on that too. Sorry. We're talking about... You were talking about them... Oh yeah, regulating. Gaming algorithms. Regulating. Yeah. So I think that ultimately the algorithmic piece does remain squarely in the purview of the platforms.
[891] And that's because it's an arms race, right?
[892] As they change their algorithm a little bit, tweak it for the product function, which they just do in their role as a business.
[893] There is no regulation that's going to come down fast enough to catch that.
[894] I think actually finance is an interesting parallel here.
[895] Because in the financial markets, there are kind of these multi-tiered levels of regulation and oversight, so that there's always some entity responsible, whether it's the exchange or a self-regulatory organization or the government and the SEC, looking to see that information integrity in the markets is being maintained, right? There's no shitty algorithm coming in to manipulate people; it's just making sure that we have that level of trust. So I think that right now the tech ecosystem is lacking regulation in all of its forms, so that will likely change. But the argument for decentralization, I don't know how you execute it. The antitrust thing in particular comes up so much more now.
[896] Excuse me. I don't know under what economic grounds you'd make that claim, that's way outside of my wheelhouse and my area, but there is something to be said for this, you know, return to decentralization in some way.
[897] Yeah.
[898] I feel like it lets people have what they want.
[899] And it lets you, you know, Reddit's a great example.
[900] You have these, it's almost like federalism.
[901] You have this central platform, but then you have these little, communities under it and each community has its own norms each community has its own rules nobody who violates the moderator rules in a Reddit and scream censorship is really taken seriously right that you this is the rules of the community you're in the community there you go and this was how um in the olden days of the internet like us nets and things you would have this is the community that you've chosen to be a part of if you don't like the moderation standards you go to this other community right I think the concern with consolidation is that people who do get moderated away, feel like there's nowhere for them to go, that they've lost access to the entire world.
[902] So I think if you have that decentralization, in some ways, it, it stops being quite so much of a freedom of speech issue.
[903] If you can't speak on, you know, if you, if everything is like, if there's 50 different platforms and you fall foul of some sort of norms or stand or community membership in this one you can go over here to this other one right then the idea that somebody has moderated you away or deplatformed you or something is much less um potent maybe if they're alternatives though if you dock someone or something along those yeah they're still like again that federalism thing it's like the little um moderation at the lower levels versus um kind of top level like you're summarily booted off yeah that seems like the best approach right like it seems like the best approach is to sort of let these communities sort of establish themselves.
[904] But even inside those communities, then you have people that gain power through moderation and they sort of abusing it and then it becomes some sort of a weird hierarchy.
[905] It's like decentralization in general is probably the right move for all this stuff.
[906] But how does that happen with something like Facebook or Twitter without government intervention?
[907] And, you know, this is one of the things that Tim Pool was bringing up.
[908] Like, if you guys don't do this, if you don't handle this, it's entirely possible.
[909] Somewhere down the line, you're going to be regulated by the government.
[910] Do you really want that?
[911] I think that that's an inevitability.
[912] Whether they want to.
[913] Yeah, at this point, yeah.
[914] What do you think is going to happen?
[915] Well, you know, honestly, I say that, and then I think back to the reality, which is in this Congress, with this executive.
[916] I don't know how we get any regulation through.
[917] I think we've seen some examples, like the Honest Ads Act, which was introduced right before the first tech hearing, if I'm remembering the timeline correctly, so that would have been late 2017. And what it said was, you can no longer have a free-for-all with ads on social platforms where nobody knows who paid for it or where it's coming from or anything like that. So Senator Klobuchar and Senator Warner, and I think Senator McCain also was part of this, created this bill saying that the platforms have to follow the rules that TV and radio already follow. And this is an example of recognizing that these are no longer startups that can't meet these obligations. It used to be that Facebook was exempt from these disclosure requirements because their ads used to be those, remember those tiny postage-stamp-size things that were on the right side of the page?
[918] So they had a finding, they were given the same exemption that campaigns get for, like, skywriting and postage and pencils, where literally the form factor of the content makes it such that you can't put the ad disclaimer on there.
[919] And it used to be that all of the advertising on Facebook was regulated using that same finding, that these postage-stamp-size things are too small to put the disclosures on. And then, of course, as we know, that evolved into the ads looking much like an organic post. And so now they do have these little things that pop up where you can see why you got targeted and what it is. I think that that's an example of the credible threat of regulation and public opinion moving the platform to take an action that it wouldn't necessarily have done on its own. So it's not regulation, but it's a nudge through public opinion and the credible threat of future regulation.
[920] We've seen California go after the platforms also recently.
[921] There was that California GDPR thing from last year.
[922] California state legislature saying we're going to pass a privacy requirement.
[923] And they did it.
[924] They got it done.
[925] What is the privacy requirement?
[926] Oh, boy.
[927] I feel like I'm probably not the best person to explain this because I don't know the specifics.
[928] But the GDPR was the law in the UK and in Europe that protects the data.
[929] It creates particular protections.
[930] Like, you have to re-opt-in for targeting.
[931] There are certain kinds of targeting that they can't do, certain types of data that you can request they delete.
[933] So this is a provision that took effect in Europe last year.
[934] We don't have that same law here in the U.S.; we don't have the same data protections as the Europeans.
[935] And so California GDPR was the California state government, the state legislature, passing a law that basically mimicked a lot of the provisions of what the Europeans were given under GDPR.
[936] But what that did was it created a law that applied to the people of California.
[937] And so Facebook and Twitter and the others don't want to be in a position of having this kind of Balkanization of legal requirements.
[938] And so they, in turn, have now, I believe, gone to Congress suggesting that we're going to need a federal solution that applies to all of the U.S., a federal-level privacy regulation, because they don't want to have to adhere to the privacy regulations of each individual U.S. state.
[939] One solution that's been tossed out was that people would have to somehow or another confirm their identity,
[940] that instead of it being an anonymous post, it would have to say Renée DiResta.
[941] Like, I know who you are.
[942] You have to have a photograph, have some sort of a government ID that shows that it's you, so that we would somehow or another minimize trolling, minimize disinformation.
[943] If your account was connected to a social security number or whatever it was.
[944] The problem, of course, is that these damn things get hacked all the time.
[945] And if your Twitter account gets hacked, now they have your social security number, they have your address, they have the information that you used to sign up.
[947] And unless there's some way, and there isn't, to absolutely lock down all that data and make sure that it's inaccessible to some third party, that doesn't seem like a likely course of action either.
[948] And on the question of identity, I think most social science research that I've read has suggested that that's not necessarily the be-all and end-all.
[949] I think it depends on, per your point, what you're trying to do.
[950] I'm thinking right now of the request for FCC comments on net neutrality. There were a lot, tons, I think millions, actually, of these fake comments left on the net neutrality call for public comment. Whoever left them was scraping, you know, those crappy data brokers, those horrible things where your name and your address is up there and no matter how hard you try you can't get it down. Scraping those to grab names and addresses and email addresses, and leaving comments pretending to be those people. It's hard. Most people don't want to enter their social security number into some form for validation, right? A lot of people will point to things like, well, you know, America has a strong commitment to anonymous speech.
[951] So there's that cultural thing.
[952] Well, I think, you know, people will point to, like, yeah, the Federalist Papers and Publius and so on and so forth.
[953] And whistleblowers.
[954] And whistleblowers, yeah.
[955] I remember when Facebook did make it a requirement that you have to validate your actual name and address.
[956] They send a postcard to your house if you want to run political ads.
[957] And then I remember people complaining, oh, God, this was during the DACA arguments, during some of the illegal immigration debates, right as this was happening, that immigration activists who were undocumented would not be able to run Facebook ads because they didn't have identification to verify with.
[959] So no matter what people put out, there's going to be somebody who has a complaint about it. So we're in this gridlock: everybody recognizes that the situation sucks and that social media is a disaster on a myriad of fronts, and there's not much in the way of plausible solutions.
[960] I think for disinformation in particular, just to stay in my wheelhouse, we're trying to push towards multi-stakeholderism, which is just to say, can we take the back channels of communication that have come up for election integrity and things over the last year and a half, and standardize that in some way?
[961] can we create an oversight body, maybe the FTC, that is at least responsible for having some oversight to make sure the platforms are doing enough.
[962] But this is, I think, this is going to be the theme of 2019.
[963] Does it go the antitrust route?
[964] Does it go the privacy route?
[965] Like, does it do a kind of hybrid combination of multiple, you know, tackling multiple problems at once?
[966] I'm really curious to see how we shake this out because it just seems like no, you know, even agreeing on what the problem is we're not quite there yet yeah and you're you are seeing calls like particularly from elizabeth warren for breaking up a lot of these larger institutions not just but not just even social media but even amazon she's talking about breaking up a lot of these bigger companies um it's the problem with that is like to what and make them what and then what happens and then you know one of that what if one of those things that you broke up, that becomes Twitter, becomes more popular than the other Twitter, and it has much more attendance than what do you do?
[967] Yeah, you should get one of those people on to just talk about that, because Lena Connor or somebody who really Matt Stoller knows this space in and out, and I just don't.
[968] The, I, you know, personally, what I feel like the, a lot of people are moving into smaller communities, a lot of people are moving into groups or moving into WhatsApp chats.
[969] They're recognizing that the system as it is right now has this toxicity and are withdrawing a bit.
[970] I don't know if you've seen this in your friends or community, but.
[971] Well, I have it.
[972] Jamie and I were actually talking about it yesterday in terms of the use of Twitter.
[973] The use of Twitter has dropped.
[974] And one thing that I notice is that my follower number doesn't move very much on Twitter, as opposed to, like, Instagram.
[976] I don't really use Facebook, but Instagram, there's a giant difference in how many followers I get per day on either platform.
[977] And it seems to me that the people that are using Twitter, they've kind of like locked in, they've found their little communities, and it's mostly toxic.
[978] I mean, I mean, I'm sure I'm generalizing.
[979] I am for sure.
[980] It's probably not even 10% toxic, but it seems toxic.
[981] You know, when you look at those kinds of comments, any time something happens, very rarely is the reaction some sort of objective, rational discourse.
[982] It's most likely just insults and, you know, swears.
[983] It's weird.
[984] It's just people are communicating in a way online that if they communicated in real life, there would be blood in the streets.
[985] I think about that a lot, actually.
[986] Yeah, I do.
[987] Especially, I was listening to your thing with Tim and Jack and Vijaya.
[988] The idea of the public square.
[989] And I use this metaphor too, like when I write about them, the privatized public square.
[990] But then I think about it sometimes.
[991] I'm like, we've never had a national public square.
[992] There's no such thing in the history of America as a national public square.
[993] There are regional public squares or town squares, state squares, you know, where there is, again, this kind of federalism.
[995] People who have self -selected to live in a particular community, there's norms in that community.
[996] But if I were to go up to you in a public square and start screaming in your face or, you know, being an asshole and trying to get a whole mob together to go after you, like probably somebody would intervene, either a bystander or the police.
[997] And we have notions of like nuisance and things like that.
[998] We have notions of like, there's more of an intuitive sense of the balance between speech, which is to be protected, and then the kind of fighting words and that sort of thing.
[999] There's no clear lines on that.
[1000] There's not much in the way of norms.
[1001] And when you're online, there is nobody who's going to come and step in and intervene in a way that would play out in real life.
[1002] So I think that we just haven't quite ported those norms of basic good behavior in the real world into this massive roiling crowd that's sort of always on at all times, and that's more where we are on social media.
[1003] Well, I think there's actually a carryover to real life from social media that you can find in these protests at universities, when conservative speakers come and Antifa wants to shut them down, and then you have people like the Proud Boys fighting with Antifa.
[1004] I don't remember that before ever.
[1005] I think this is a byproduct of the type of communication that's the norm on social media.
[1006] I really think that's what's happening here.
[1007] I really think that instead of social media mimicking real life, real life is starting to mimic the kind of interactions that people have on social media and with violent repercussions.
[1008] It's certainly, if not violent, very aggressive and angry in a way that... go back before social media, to the 2000s and the 1990s: how often were there these incredibly volatile protests at universities where you have conservatives and liberals screaming at each other?
[1011] And you have these people that are being deplatformed and they won't let them speak at these colleges.
[1012] And then they're gathering up this online mob to try to bolster support.
[1013] And then people come to meet them.
[1014] We're going to stop them at all costs.
[1015] And it's kind of flavoring real life, versus real life being represented in social media.
[1017] Yeah, I don't remember it from when I was in college either.
[1018] No, it didn't exist.
[1019] I'm not the expert on college protests, and, you know, maybe people would point to the '60s or something.
[1020] You got to go back to Kent State.
[1021] That's what I was going to say.
[1022] You got to go back to war protests.
[1023] But they were protesting something very specific, an unjust war that nobody wanted to be a part of.
[1024] This is a weird time.
[1025] Yeah, it is.
[1026] It, um... I know. It feels unstable. Yes, yes, that's the best way to put it. But it's also awesome. I'm enjoying the shit out of it. Because I love the fact that, for the first time in my life, it does not seem like the government has a fucking handle on how people are behaving and thinking at all. Like, they don't know what's going on. You've got people that are trying out socialist tropes and socialist ideas for the first time in the mainstream, and they're getting a big groundswell of support behind it. And you have a lot of people that are pro-nationalist and pro-America for the first time in a long time, and that's getting a lot of support. There's more discourse now, even if it's toxic. And I think a lot of it isn't. Like I said, if 10% is toxic, it seems like it's all toxic. If one out of ten people calls you a piece of shit, you're like, oh, I got to get out of this fucking forum, right?
[1027] I mean, that's really how it feels.
[1028] I think that's true.
[1029] I think that's true.
[1030] There's definitely, I've noticed that too.
[1031] The actual number of horrible trolls is lower than it feels. Maybe this afternoon will be different.
[1032] I don't know.
[1033] They'll try to prove you wrong.
[1034] But, you know, I'm on Twitter a lot, and it's the platform I use more than any other.
[1035] So for all the complaints, I do feel like there's some real value there for me personally. And I like the serendipity of the unexpected being pushed into my feed occasionally. Sometimes, of course, I get angry, or feel annoyed, or think, why the hell, you know, why this? But I think ultimately there is a lot of value to the platform. Where, unfortunately, I really do believe that so much of the polarization in the conversation around speech is people who got burned during the laissez-faire days of 2015, 2016, mentally linking up the idea of free speech with the idea of being harassed online.
[1036] And I think, when you look at, this is purely an opinion, I have absolutely no data to bear this out, but when you look at those Pew studies that show that younger people are more likely to want, you know, safe spaces or less offensive speech.
[1037] I do sometimes wonder if that's an effect of coming of age at a point when, you know, random assholes were screaming at you on social platforms 24/7, versus, like, I didn't have that experience growing up.
[1038] I didn't have that experience until I was, like, 25, you know?
[1039] So maybe there is something to your point, that there's almost not much of a difference anymore, you know, because people spend so much time online.
[1040] This is where you're having your social engagements.
[1041] This is where you're having your conversation.
[1042] So it does shape the way people think about, you know, their experience of what it means to have a conversation and what it means to speak freely.
[1043] I think that's one of the interesting things. It doesn't help that free speech has sometimes, in many cases, become a fig leaf for: I want carte blanche to, you know, say all kinds of mean shit to people all day long with no consequences.
[1044] Yeah, they think they should be able to do that because that falls under the blanket of free speech.
[1045] There's also an issue, I think, with young kids today that have smartphone addictions, people today, I should say, forget young kids.
[1046] How about me?
[1047] Humans, smartphone addictions.
[1048] Yeah, me too.
[1049] And when you're online, when you think about the majority of your interactions with human beings, there's a lot of folks that are on their phone eight hours a day.
[1050] That's probably me. But think about those interactions.
[1051] I know.
[1052] That's a shocking number of interactions with people that you're not even in the direct physical presence of.
[1053] You're not looking at them.
[1054] You're not waiting for them to talk.
[1055] You're not considering what they're saying.
[1056] You're not reading social cues.
[1057] All the things that make us human.
[1058] Those are all thrown out the window.
[1059] You're just looking at text, and it might be coming from Russia.
[1060] The text that you're getting upset at and responding to, I mean, per your research, there's a real possibility
[1061] it's not even a fucking person, or it's a person who's not really representing their actual thoughts, just trying to push your buttons.
[1062] Yeah, no, that's true.
[1063] It's not even Russian trolls.
[1064] It's just the amount of, I think it really is, we're in the unique time where it's hard to know who you're engaging with.
[1065] It's hard to gauge whether it's good faith.
[1066] I mean, I react sometimes where I'll see a response and go click into the person's feed to try to decide if I should take it as a serious, good-faith inquiry or if it's, you know, kind of a vaguely cloaked fuck-you.
[1067] Right, right, right.
[1068] Right.
[1069] You don't have to do that when you're in person.
[1070] It's a very different, very different experience.
[1071] Yeah, I have a friend of mine who's a young single male, and he was going through his direct messages, and these girls were sending him all these naked photos and videos.
[1072] And he's like, look at this, man. I go, let me see your phone.
[1073] And I said, let me click on that link.
[1074] I go, she has one picture on her page, you fucking dummy.
[1075] This is probably not even a person.
[1076] Like, who knows what?
[1077] This is someone from Nigeria trying to get your credit card information.
[1078] Yeah, catfishing.
[1079] Yeah, I mean, they're trying to get you.
[1080] And he's like, oh, yeah.
[1081] Like, how do you not go to her page?
[1082] Because I just, you know, I thought it was a girl sending me naked pictures.
[1083] You know, they just do that sometimes.
[1084] Well, they definitely do do that sometimes.
[1085] I mean, people are weird.
[1086] They do all kinds of weird things.
[1087] But, you know, there's a lot of these fake accounts.
[1088] And I don't know what they're trying to do.
[1089] They're trying to get money from people or something? That does come up.
[1090] I've seen, every now and then, if you follow reporters on Twitter, particular ones who have, like, open DMs, they'll periodically post the insane catfishing stuff that they get, where it's, like, all about money. I don't know how they do it. I'm glad I'm not a journalist. Yeah, I don't get it. So overall, are you happy that you got into all this? Does this change your perceptions of online communication? Um, I feel like, I have tried over the years, whether it's conspiracy theorist communities or terrorists or, you know, Russia, Iran, the state-sponsored actors, the domestic ideologues, I have tried to always say: here is the specific kind of forensic analysis of this particular operation,
[1091] and then here is what we can maybe take from it and make changes.
[1092] We've seen some of that begin to take shape, and so I feel grateful to have had the opportunity to work towards connecting those dots and work towards having this conversation, meaning helping people understand what's going on.
[1093] I think I am most concerned about, as this gets increasingly easy to do through things like chatbots... you know, you've seen the website thispersondoesnotexist.com?
[1094] No. So there's a technology, a machine learning technique called generative adversarial networks, and basically, in this particular application, they're working to create pictures of people, faces of people.
[1095] And so this website is, when you go to it, it pulls up a, yeah, there you go.
[1096] So this person does not actually exist.
[1097] That's a fake human?
[1098] Yeah.
[1099] And so these are all.
[1100] Whoa.
[1101] These are computer generated images?
[1102] Yep.
[1103] Oh, God.
[1104] So you can see, created by a GAN, it says it down at the bottom there.
[1105] So these are not real people.
[1106] And so we have increasingly sophisticated chat technology.
[1107] We have increasingly sophisticated, like, you're not going to detect that image somewhere else.
[1108] That old trick of, like, right-click and search to see if you're talking to someone with a stock photo?
[1109] That goes right out the window as stuff like this gets easier and easier to do.
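A minimal sketch of the adversarial idea behind GANs described above: a generator network learns to turn random noise into images, a discriminator network learns to tell those fakes from real photos, and each improves by trying to beat the other. This toy PyTorch version is illustrative only; the layer sizes, dimensions, and names here are assumptions for demonstration, not the actual StyleGAN model behind thispersondoesnotexist.com.

```python
# Toy GAN training loop (illustrative sketch; sizes and names are assumptions).
import torch
import torch.nn as nn

latent_dim = 100      # size of the random noise vector fed to the generator
image_dim = 64 * 64   # a flattened 64x64 grayscale image, for simplicity

# Generator: maps random noise to a fake image (values in [-1, 1] via Tanh).
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: scores an image as real (~1) or fake (~0).
D = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    """One adversarial update; real_images has shape (batch, image_dim)."""
    batch = real_images.size(0)
    fake_images = G(torch.randn(batch, latent_dim))

    # 1) Train the discriminator to label real images 1 and fakes 0.
    #    .detach() stops this step's gradients from reaching the generator.
    opt_d.zero_grad()
    d_loss = (bce(D(real_images), torch.ones(batch, 1)) +
              bce(D(fake_images.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to make the discriminator call its fakes real.
    opt_g.zero_grad()
    g_loss = bce(D(fake_images), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()

# Usage sketch: each step would pull a batch of real face images, e.g.
# train_step(real_batch.view(real_batch.size(0), -1))
```

The tug-of-war in steps 1 and 2 is the whole trick: the generator is scored against the label "real," so the only way its loss goes down is by producing faces the discriminator can no longer distinguish from photographs.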
[1111] Well, then deep faking, right?
[1112] Yeah, the deep fakes on the video front.
[1113] I think that it does change.
[1114] I think we haven't quite adapted to what is it like to live in a world where so much of the internet is fake.
[1115] And I do think, per your point about identity, that there will be groups of people that self -select into communities where identity is mandatory, you know, where this is who you are and you have some sort of verification versus people who choose to live in the world of you know, drink from the fire hose, take it all in and try to filter it out yourself.
[1116] So we look at these evolving technologies and I don't necessarily feel, you know, particularly optimistic in the short term.
[1117] I think that ultimately it does, like we change as a society to a large extent in response to this.
[1118] We think about, you know, there are going to be some fixes that the platforms are going to be able to undertake, they're going to be, we're going to get better at detecting this stuff.
[1119] Maybe, you know, the adversary will evolve.
[1120] Hopefully we get better at detecting it as it evolves.
[1121] But it's, I think we fundamentally, ultimately change.
[1122] Like, people become more aware that this is a thing.
[1123] They are more skeptical.
[1124] That does change our ways of interacting with each other.
[1125] But I feel like that is going to be the direction that this goes.
[1126] The thing that keeps me up at night would be more the ease of turning this from a social media problem into, like, a real-world war problem. Meaning, as an example, back in 2014, one of the first things the Internet Research Agency did, September 11th, 2014, they created a hoax saying that ISIS had attacked a chemical plant down in Louisiana.
[1127] It's called the Columbian Chemicals plant hoax. I think there's a Wikipedia article about it now.
[1128] But what happened was they created a collection of websites.
[1129] They created fake CNN mockups, Twitter accounts, text messages that went to local people, radio station call-ins, you name it, everything to create the impression that a chemical factory had just exploded in Louisiana, and there was some attribution to ISIS.
[1130] And this was done on September 11th.
[1131] So this is the kind of thing where this actually did go viral.
[1132] Like, I remember this happening, not as a social media researcher.
[1133] I just remember it actually being pushed into my social media feed.
[1134] So you have these, and we didn't know that it was the Internet Research Agency until a year and a half after.
[1135] But this is the kind of thing where you look at parts of the world that aren't the U.S., like the recent drama between India and Pakistan, and you can see how these kinds of things can go horribly, horribly wrong if the wrong person is convinced that something has happened, or if this leads to a riot, or if this leads to real-world action. I think that's one of the main fears as this gets better and better, the video fakes get better, the people fakes get better. What do you do then? Yeah, what do you do when you see those images, those fake images? Those are stunning. They're so good. I mean, it just makes you wonder. And we're going to get to a point where, if someone's not in front of you talking, you're going to look at a video and you're not going to have any idea.
[1136] You know, they're doing those deepfakes with famous actresses' faces, and they put them in porn films.
[1137] And it's stunningly good.
[1138] I mean, it's amazing.
[1139] And also, with someone like me, I'm fucking doomed, because there's thousands of hours of me talking.
[1140] I've said everything.
[1141] Yeah.
[1142] So there are these new programs that are editing audio,
[1143] and you could splice together audio and video now.
[1144] They don't even splice it.
[1145] The computer will generate it.
[1146] Yeah.
[1147] It's insane.
[1148] It generates your lip movements, everything.
[1149] I mean, it's really stunning.
[1150] It's really stunning, and it's only going to get crazier and crazier.
[1151] And unless something is actually happening right in front of your face, it's going to be very difficult to differentiate.
[1152] And then I'm worried about augmented reality and virtual reality, and this stuff making its way into it.
[1153] I mean, we're going to dive willingly,
[1154] with a big smile on our faces, into the Matrix.
[1155] Have you watched that recently?
[1156] No, I haven't.
[1157] I watched that on a plane like a month ago or something.
[1158] It holds up so well.
[1159] No, it holds up.
[1160] Really?
[1161] It holds up.
[1162] Like, it's insane, except for the phone booths.
[1163] It's the one thing where, like, there's no phone booths anymore.
[1164] But everything else is, um, wow.
[1165] Yeah, it's, I don't know, you guys have seen it recently?
[1166] I just bought it on 4K because, like, I forgot, you know, in '99 there was barely HD back then.
[1167] And I just wanted to see what it was like.
[1168] Rewatched it, forgot how good it was, and, like, two, two and a half hours flew by.
[1169] The whole movie just went by.
[1170] Yeah, it's amazing how that seemed preposterous in '99 or whatever it was.
[1171] Like, oh, this is just sci -fi.
[1172] And now you're like, hey, this is a little closer.
[1173] Like, the idea of, have you messed around at all with, like, HTC Vive or Oculus?
[1174] I tried.
[1175] I was a VC briefly,
[1176] like, back five, six years ago now.
[1177] And I tried one from, I think it was USC, right?
[1178] The University of Southern California has a bunch of really good labs down here.
[1179] And I tried one where it was like a zombie holodeck simulator.
[1180] And it was, it wasn't just the VR.
[1181] It was also, it was immersive, so they had a backpack on me. And it was actually scary as hell.
[1182] I was like, this is really good.
[1183] This is, like, I love first person shooters.
[1184] I think they're so much fun.
[1185] But this was just the first time where, in the game, you have, like, a bat or something, and you're trying to beat zombies with a bat, and they're, like, all up in your face. And, um, I don't know if that thing ever came to market, but damn, was it good. Well, they have a company, it's called The Void now, and they have this Wreck-It Ralph one, and I did it with my kids recently. And you put on a haptic feedback vest and you go through this environment, and it's great. I mean, it's very clear that you're in a video game; it doesn't seem real. But it is so much better than anything that existed five years ago.
[1186] And you go, okay, well, what is with the exponential increase in power of technology?
[1187] What is this going to be like in 10 years?
[1188] What's it going to be like in 15?
[1189] It's going to be impossible to differentiate because now it's a vest.
[1190] It's just a vest.
[1191] You're not strapped into a chair.
[1192] You can move around.
[1193] So you're going through this whole warehouse that they have set up for these games.
[1194] You know, you're even picking up physical objects, and they look different in your hands than they do when you look at them without the headgear on.
[1195] Yeah, there's like centers for that.
[1196] Like, I know in Vegas they have them.
[1197] I haven't been to one here, though.
[1198] They have one outside of Disneyland.
[1199] It's in Disney, downtown Disney.
[1200] It's called The Void.
[1201] And then you go into these, there's one Star Wars one.
[1202] There's a Wreck-It Ralph one.
[1203] Now they have them in malls, small ones.
[1204] You sit in these little eggs and you go on roller coaster rides and you fight off zombies and go into a haunted house.
[1205] It's getting weird.
[1206] And it's just, you know, I was a kid when video games were these ridiculous Atari things where you stuck a cartridge in and you're playing Pong. And now we're looking at these images of people, you see pores, you see the glistening of their lips, you see their eyes. It's very strange, it's very strange to think that those are not really people. And all you have to do is create something that is so large, the propaganda so terrifying, that it causes you to act without double-checking, triple-checking, and making sure you verify the fact that something has really happened.
[1207] And then it sets into motion some physical act in the real world that you can't pull back.
[1208] Like, and this is not necessarily related, but look at what happened with Hawaii when they got that false warning.
[1209] Oh, my gosh.
[1210] That nuclear missiles were headed their way.
[1211] Yeah, that was a, can you imagine.
[1212] It was crazy.
[1213] And it was just someone hit the wrong button.
[1214] I mean, if we come to some sort of a point in time where someone does something like that on purpose and shows you video that you really think New York City just got nuked and, you know, you have to head to the hills and there's a giant traffic jam on the highway and people start shooting each other.
[1215] I mean, if Russia really wants to fuck with us, what they're doing now, with this IRA, the Internet Research Agency, and all of these different trolls that they've got set up, sort of trying to get people to be in conflict with each other,
[1216] this is with primitive, crude text and memes.
[1217] Yep.
[1218] What could be done in the future?
[1219] It's terrifying.
[1220] We live in a weird world.
[1221] Yeah.
[1222] We're in agreement.
[1223] Yep.
[1224] Let's end it there.
[1225] All right.
[1226] Thank you so much.
[1227] I really appreciate it.
[1228] Thanks for coming down.
[1229] Thanks for having me. And thank you for all your work exposing all this stuff.
[1230] It's really, really interesting and terrifying.
[1231] Tell people how they can get a hold of you so they can troll you on Twitter.
[1232] My handle's @noUpside.
[1233] No upside.
[1234] And you don't necessarily have it on Instagram?
[1235] It's more like family.
[1236] My Instagram's my kids.
[1237] All right.
[1238] Thank you, Renee.
[1239] I really appreciate it.
[1240] Thank you.
[1241] Bye, everybody.