The Daily
[0] From the New York Times, I'm Michael Barbaro.
[1] This is The Daily.
[2] Today, a secretive company promising the next generation of facial recognition software has compiled a database of images far bigger than anything ever constructed by the U.S. government.
[3] The Daily's Annie Brown speaks to reporter Kashmir Hill about whether the technology is a breakthrough for law enforcement, or the end of privacy, as we know it.
[4] It's Monday, February 10.
[5] Kashmir, how did this story come to you?
[6] So I got an email.
[7] It was a Wednesday morning.
[8] I was checking my phone.
[9] And it was from a tipster who had gotten a bunch of documents from police departments.
[10] And one of the police departments had sent along this memo about a private company that was offering a radical new tool to solve crimes using facial recognition.
[12] And what would make a facial recognition tool radical?
[13] So law enforcement has, for years, had access to facial recognition tools.
[14] But what this company was offering was unlike any other facial recognition tools that police have been using because they had scraped the open web of public photos from Facebook, from Venmo, from Twitter, from education sites, employment sites, and had a massive database of billions of photos.
[15] So the pitch is that you can take a picture of a criminal suspect, put their face into this app, and identify them in seconds.
[16] And when you read this memo, what do you make of what this company is offering?
[17] So I've been covering privacy for 10 years, and I know that a technology like this in public hands is the nightmare scenario.
[18] This has been a tool that was too taboo for the Silicon Valley giants who are capable of building it.
[19] Google in 2011 said that they could release a tool like this, but it was the one technology they were holding back because it could be used in a very bad way.
[20] And why exactly is this kind of technology, this line in the sand that no one will cross?
[21] What makes it so dangerous?
[22] So imagine this technology in public hands.
[23] It would mean that if you were at a bar and someone saw you and was interested in you, they could take your photo, run your face through the app, and then it pulls up all these photos of you from the internet.
[24] It probably takes them back to your Facebook page.
[25] So now they know your name.
[26] They know who you're friends with.
[27] They can Google your name.
[28] They can see where you live, where you work, maybe how much money you make.
[29] Let's say you're a parent and you're walking down the street with your three-year-old.
[30] Somebody can take a photo of you and know where the two of you live.
[32] Imagine you're a protester in the U.S. or in a more authoritarian regime.
[33] All of a sudden, they know everything about you, and you can face repercussions for just trying to exercise your political opinions.
[34] You know, if this app were made publicly available, it would be the end of being anonymous in public.
[35] You would have to assume anyone can know who you are anytime they're able to take a photo of your face.
[36] And so that technology is what this company is pitching these police departments.
[37] Exactly.
[38] And what do you know about this company at this point?
[39] So at this point, all I really know is that the company is called Clearview AI.
[40] And so the first thing I do is Google it, and I find their website, which is clearview.ai. And the website is pretty bare, but there's also an office address listed there, 145 West 41st Street, which happens to be just a couple of blocks from the New York Times office.
[41] Right.
[42] So I decided to, you know, walk over there, and there just is no 145 West 41st Street.
[43] So that was weird.
[44] So now I have this company that's offering this radical new tool.
[45] It's got a fake address.
[46] It's got a fake address, which is, you know, a huge red flag.
[47] So what do you do next?
[48] I found the company on LinkedIn.
[49] It only had one employee listed, a sales manager named John Good, which...
[51] John Good.
[52] John Good.
[53] It seemed like it could also be fake, and I sent that person a LinkedIn message and never heard back.
[54] So one of the things I find online is a website called Pitchbook that lists investments in startups.
[55] And so it says that this Clearview AI has received $7 million from a venture capital firm and from Peter Thiel. You know, a big name in Silicon Valley who invested in Facebook and Palantir.
[56] So I reach out to his spokesperson and he says, I'll get back to you.
[57] I never hear from him again.
[58] And then one day I open up Facebook and I have a message from a friend whose name I don't recognize.
[59] And he says, hey, I hear you're looking into Clearview AI.
[60] You know, I know them.
[61] They're a great company.
[62] How can I help?
[63] And you don't know who this guy is.
[64] I don't.
[65] I mean, it's a guy I met once, 10 years ago, and somehow he knows that I'm looking into this company.
[67] But I'll take it.
[68] You know, finally someone wants to talk to me about Clearview AI.
[69] And so I say, hey, can I give you a call?
[70] And then he doesn't respond, which I'm getting used to.
[71] You just can't catch a break.
[72] I know.
[73] I'm like, I cannot believe this is another dead end.
[74] So phone and email are not working for me. So I just need to figure out another door to knock on to try to talk to a real human being.
[75] And one of the investors in the company is this venture capital firm that has an office in Bronxville, New York.
[76] So on a cold, rainy Tuesday, I got on the train and headed to Bronxville.
[77] I get to the company's address.
[78] It's just like in a retail space, and I go inside.
[79] There's this long, quiet hallway of office suites.
[80] And this venture capital firm is at the very end, and I knock on the door, and there's no one there.
[82] So I start trying to talk to their neighbors, and a woman who works next door says, oh, yeah, they're never here.
[83] So I'm walking down the stairs to go back out of the building, and two guys walk through the door.
[84] They're both in dark suits with, like, lavender and pink shirts underneath, and they just kind of look like VCs to me. So I say, hey, are you with this venture capital firm?
[85] And they say, we are.
[86] Who are you?
[87] And I was like, I'm the New York Times reporter who's been trying to get in touch with you.
[88] And they said, you know, the company has told us not to talk to you.
[89] And I said, well, I've come all the way out to Bronxville.
[90] You know, can we just chat for a little bit?
[91] And they say, okay. It probably helps that I'm, like, very pregnant.
[93] And they offered me water.
[94] And they just start telling me everything.
[95] And what do they tell you?
[96] They confirm that they've invested in Clearview AI and that Peter Thiel has also invested.
[97] They identified the genius coder behind the company.
[98] This guy named Hoan Ton-That, and they say he's Vietnamese royalty, but he's from Australia.
[99] And they also tell me that Hoan is the one that was using the fake name, John Good, on LinkedIn.
[100] He's John Good.
[101] He's John Good.
[102] And they confirm that law enforcement is already using the app, and that law enforcement loves it and that it's spreading like wildfire.
[103] Wow.
[104] So I've learned some stuff from these two investors, but no one from the company is talking to me still.
[105] So in the meantime, I'm also reaching out to law enforcement because I want to know if this app really works as well as the company claims.
[106] By this point, I had learned that over 600 law enforcement agencies had tried the app, including the Department of Homeland Security and the FBI.
[107] Wow.
[108] It's not just local police departments.
[109] This is being used by the federal government already.
[110] Yeah, I mean, I was just shocked to discover how easily government agencies can just try a new technology without apparently knowing much about the company that provides it.
[111] So I talked to a retired police chief from Indiana, who was actually one of the first departments to use the app.
[112] And they solved a case within 20 seconds, he said.
[113] A case they hadn't been able to solve.
[114] That they hadn't been able to solve.
[115] One of the officers told me that he went back through like 30 dead-end cases that hadn't had any hits on the government database.
[116] And he got a bunch of hits using the app.
[117] So they were really excited about it.
[118] This is way more effective than what they were using before.
[119] Exactly.
[120] With the government databases they were previously using, they had to have a photo that was just a direct, you know, full-face photo of a suspect, like mugshots and driver's license photos.
[122] But with Clearview, it could be a person wearing glasses or a hat or part of their face was covered or they were in profile and the officers were still getting results on these photos.
[123] Wow.
[124] But the most astounding story I was told was that investigators had this child exploitation video and there was an adult who was visible in the video just for a few seconds in the background.
[125] So they had this person's face.
[126] They had run it through their usual databases and not gotten anything back.
[127] But then they ran his face through Clearview's app.
[128] And he turned up in the background of someone else's gym selfie.
[129] Like you could see his face in the mirror.
[130] And so they figured out what gym this photo was taken at.
[131] They went to the gym.
[132] They asked the employees, do you know who this is?
[133] And the employee said, you know, we can't tell you.
[134] We have to protect our members' privacy.
[135] But then later, the detectives got a text from somebody who worked there, identifying the person.
[136] And that, I mean, that's just something that would not have been possible without Clearview's app.
[137] So because the officers were telling me the tool works so well, I wanted to see it for myself, on myself, and I asked them if they would run my photo through the app.
[138] Every time I did this, things would get weird.
[139] The officers would tell me that they ran my photo and there were no results.
[140] No pictures of you.
[141] There were no pictures of me, which was really weird because I have a lot of photos of myself online.
[142] And then officers would just stop responding to me or talking to me. And I had no idea what was going on until one officer was kind enough to explain to me. Hello, how are you?
[143] Hey, it's Kashmir.
[144] Yeah, see.
[145] I'm keeping this officer anonymous because he could get in serious trouble for talking to me so openly about Clearview.
[146] If you could just describe yourself to the extent that you can describe yourself.
[147] I'm a police officer at a large metropolitan police department.
[148] So he's a cop who was doing a 30 -day free trial of the app, and he was really impressed with it.
[149] So I asked him if he wouldn't mind running my photo.
[150] And what did he tell you happened when he sent your picture through?
[151] Yeah, nothing.
[152] I didn't get a response at all.
[153] No results.
[154] No results.
[155] And within a couple of minutes of you putting your photo up there, maybe five, less than 10.
[156] I got a phone call from the Clearview company.
[157] They wanted to know why I was uploading a New York Times reporter's photo.
[158] That is so wild.
[159] I don't know.
[160] It creeps me out as a reporter.
[161] I mean, yeah, just...
[162] It kind of creeped me out as a user.
[163] This implied that Clearview flagged my face in their system such that they got an alert when a police officer ran my face, which I found, you know, very alarming because this is telling me for the first time that this company is able to monitor who law enforcement is looking for and not just know who they're looking for, but manipulate the results.
[164] And so then that made me go back to the earlier officers who had run my photo, and they all confirmed, yes, I got a call from the company, and they said, you know, we're not supposed to be talking to the media.
[165] So were you able to keep using the app after that?
[166] My account was deactivated.
[167] Did you ever get access back?
[168] I never did.
[169] But, you know, I have colleagues that have access.
[170] So if I were to need a picture searched, I could just email it to them and they can email me the results.
[171] And you think the trade -offs are worth it in terms of, you know, what the company has access to?
[172] Do I think it's worth it?
[173] So from a law enforcement perspective, it's worth it.
[174] You know, we get a lot of cases and we don't usually have a lot of leads.
[175] And so anything that can, honestly, anything that can help us solve a crime is a win for us.
[176] From a privacy perspective, it's rather frightening, the amount of information that they were able to get and provide.
[178] As long as they're doing it for the right reasons, everything will work out.
[179] Let's put it that way.
[180] But the problem is we don't know anything about the company at this point.
[181] We don't know if there's any kind of oversight.
[182] We don't know who the people are that are operating this and what their intentions are with their product.
[183] You know, the person in charge of the company won't talk to me. But then it's the end of December when I get a call from the company's spokeswoman, and she says that the founder, Hoan Ton-That, is ready to talk.
[184] We'll be right back.
[185] Do you have a hard stop?
[186] No, I don't actually.
[187] 12:30, uh, 12 noon.
[188] I have no hard stop.
[189] Oh, and I have lots of questions, so I'll take as much time as you can give me. So, Kashmir, you finally got an interview with the founder of Clearview.
[190] This man named Hoan Ton-That.
[191] Where do you meet him?
[192] So we met in a WeWork in Chelsea.
[193] He came down to the lobby.
[194] And his appearance surprised me because I had Googled him online and there are a lot of photos of him.
[195] And he's usually pretty eccentric, like a lot of paisley shirts.
[196] He goes to Burning Man. But in person, he was very conservative.
[197] He was in this dark navy blue suit with a white button-up and leather shoes.
[198] So he looked very much like the security startup entrepreneur.
[199] He was looking the part.
[200] He was looking the part.
[201] When were you born?
[202] How old were you?
[203] 88.
[204] I'm 31.
[205] Okay.
[206] And what do you learn about him?
[207] So he is 31.
[208] He grew up in Australia.
[209] You can't hear that in his voice.
[210] I love computers.
[211] Yeah.
[212] So how did you get interested in technology?
[213] We had a computer, of course, when I was four or five years old.
[214] So his family got a computer when he was three or four, and he was always tinkering with computers growing up.
[215] I got the internet when I was 10, I think.
[216] And then you could discover all these things online.
[217] But Linux, I was like, I have to get this thing.
[218] It's the nerdiest thing ever.
[219] Convinced my dad, we installed it.
[220] And I spent the whole summer reinstalling and learning Linux stuff, staying home from high school and learning programming for fun.
[221] So that's, I just really liked it.
[222] He enrolled in college, decided to drop out like many technologists do, and moved to San Francisco when he was 19.
[223] 2007, before it was a big thing, right?
[224] It was kind of getting there, but it wasn't huge.
[225] This is 2007, and this is kind of a boom time.
[226] You know, the iPhone has just come out.
[227] That's the Facebook app era.
[228] Remember that?
[229] Yeah.
[230] People are becoming millionaires by making Facebook games, and he wants to be the next big app guy.
[231] Being there is a lot different from reading about it online.
[232] You absorb a lot more of how people get things done, and you learn a lot more secrets.
[233] What did he build?
[234] So the Facebook apps were like would-you-rather apps and kind of like romantic gifts.
[235] Did some of the first iPhone games as well.
[236] One of his most recent apps was called Trump Hair, and it was an app for adding Trump's hair to your photos.
[237] That's it.
[238] That's it.
[239] The tagline was, it's going to be huge.
[240] Okay.
[241] So how do you move from a Donald Trump Hair app to something that seems like it could revolutionize police work?
[242] Well, he moved to New York, and that seemed to be a big change for him.
[243] And he started meeting very different people.
[244] One of the most important people he met was Richard Schwartz.
[245] I ended up meeting Richard at a party.
[246] This 61 -year -old guy who worked for Mayor Rudy Giuliani in the 1990s, who was just very politically connected.
[247] I really love that.
[248] He had a lot of stories.
[249] And then we talked for like an hour about different ideas.
[250] I was like, this is what I do.
[251] Technology, I can make anything.
[252] And it went from there.
[253] And the two of them decided, with Hoan Ton-That's tech know-how and Richard's Rolodex, that they wanted to try to start a facial recognition company together.
[254] And why facial recognition?
[255] Why did the two of them choose that?
[256] I think it was because Hoan had started reading a lot of papers about facial recognition and machine learning.
[257] I had never really studied AI stuff before, but I could pick up a lot of it.
[258] And I think they realized they could make money doing it.
[260] What were you thinking?
[261] In terms of, like, the range of ideas at first, what were you thinking?
[262] A lot.
[263] I could go on.
[264] It's really crazy.
[265] There's a lot of face recognition algorithms out there and a lot that work pretty well.
[266] What was different about what Hoan Ton-That and Richard Schwartz were doing is they had been willing to scrape all of these photos from the Internet.
[267] So they just had a huge database of photos.
[268] Right.
[269] Billions of photos.
[270] Exactly.
[271] And then we hit this point where we got to like 99%.
[272] I remember that we were just in the office.
[273] It was like, wow, it works.
[274] Try that one again.
[275] Try that one again.
[276] And just every time I would pick the right person out, and that's when you knew, this is crazy.
[277] This actually works.
[278] Is that legal?
[279] Can you just take photographs from anywhere on the Internet and use them for this kind of thing?
[280] There was a ruling in a federal court this fall that said, yeah, this kind of public scraping seems to be legal.
[281] And what are they hoping to do with this software at this point?
[282] I mean, they're just trying to figure out how they can make money off of the app.
[283] And so they eventually end up settling on law enforcement.
[284] And they started solving cases from grainy ATM photos, cases they would have never solved.
[285] So this kind of spread to different departments and then from one agency to other agencies.
[286] And do you ask him about that thing that happened with the officer who couldn't find your photos?
[287] Yeah.
[288] So that was one of my questions.
[289] and I wasn't entirely satisfied by his answer.
[290] One thing that surprised me, some of the officers I talked to tried to run my photo through it, and they got no hits.
[291] And I have, like, tons of photos online?
[292] It must have been a bug.
[293] Do you guys block me from, like, getting results?
[294] I don't know about that.
[295] Because I was like, this doesn't make any sense.
[296] He said, oh, yeah, that was a software bug.
[297] But he laughed.
[298] I was like, I have a thousand photos online.
[299] This can't work as well as they say it works.
[300] Yeah, well, it must have been a bug in the software or something.
[302] Why did you do that?
[303] Don't make me think that's that.
[304] Maybe it doesn't work.
[305] You never know, right?
[306] This could be the long con. It works.
[307] What do you think that was about?
[308] I don't think it was a software bug.
[309] It's about it.
[310] I don't know.
[311] You have no idea, huh?
[312] Huh.
[313] Yeah.
[314] So he said the software bug is now fixed.
[315] Oh, yeah.
[316] So I'll show you.
[317] This is the iPhone version.
[318] And he took a photo of me. Oh, it does work.
[319] Oh, that's so surprising.
[320] I know.
[321] And there, the results included a bunch of photos of me online.
[323] Oh, my God, I totally forgot.
[324] That was 10 years ago.
[325] Including some I had never seen before.
[326] I didn't know these were online.
[327] So he's just brushing off this weird thing that happened to you.
[328] But do you get the sense that he's thinking at all about privacy?
[329] So I asked him, you know, this is a very powerful app.
[330] And I asked him, what restrictions is he thinking about for it?
[331] And he said, you know, one, that they were only selling it to law enforcement right now, though it does turn out that they're also selling it to a few private companies for security purposes.
[333] But he said they wouldn't sell it to bad actors or bad governments.
[334] And our philosophy is basically, if it's U.S.-based, or like a democracy or an ally of the U.S., we will consider it.
[335] But like no China, no Russia or anything that wouldn't, you know, be good.
[336] So if it's a country where it's just governed terribly or whatever, I don't know if you feel comfortable, you know, selling to a certain country.
[337] So it doesn't sound like he has much of a rubric for deciding who to sell to.
[338] And it sounds like there's no one really overseeing how he's making these decisions.
[339] At this point, it's just up to clear view to decide who they want to sell the app to.
[340] There's some pressure, but like when we talk to some venture capitalists, they're like, why don't you make this consumer?
[341] You know, law enforcement is such a small market.
[342] You won't make that much money.
[343] And we've considered it.
[344] And it was just like, what's the use case here?
[345] And there are, you know, right now we can help catch pedophiles.
[346] What if a pedophile got access to this and goes around the street?
[347] But when I was talking to one of their investors, he says, you know, we want to dominate the law enforcement market.
[348] And then we want to move into other markets like hospitality, like real estate.
[349] And he predicted that one day, you know, all consumers will have access to this app.
[351] I can tell you that one of your investors hopes that you guys are going to go into the consumer space.
[352] Yeah.
[353] Talks too much.
[354] But like we're not going to do that.
[355] I just don't.
[356] Hoan seems to be saying, yeah, there's pressure on us to sell to private consumers, but we're not going to do that.
[357] And how reasonable is it to think that he has control or the company has control at this point over where this technology goes?
[358] I mean, one point that I made when I was talking to him is that oftentimes the tools that law enforcement use end up in the hands of the public.
[359] I just, I personally feel like you guys have kind of opened the door to now this becoming more normalized.
[360] Just because a lot of tools that law enforcement have eventually make their way into the public hands.
[361] Not always.
[362] Everyone has a gun.
[363] Right?
[364] Anyone who wants one can get one in the U.S., basically.
[365] His response was strange.
[366] He said, well, look at guns.
[367] Law enforcement has guns, but not everybody has a gun.
[368] And I don't know if that's because he's from Australia.
[369] He's proving your point in the way.
[370] He did seem like he was proving my point rather than rebutting it.
[371] You know, we've been building the technology to make this possible for years now.
[372] Facebook building this huge database of our photos with our names attached to it, advances in image recognition and search technologies. It all led us here.
[375] But there's been no accompanying regulation or rules around how the technology should be used.
[376] There's no real law or regulation that makes this illegal.
[377] The scraping seems to be okay.
[378] We don't have a big ban on face recognition.
[379] We don't have to give consent for people to process our faces.
[380] And so in terms of holding this tool back, we're just relying on the moral compasses of the companies that are making this technology and on the thoughtfulness of people like Hoan Ton-That.
[381] But what do you think about that?
[382] Do you think that this is kind of too dangerous a tool for everybody to have?
[383] I'd have to think about that and get back to you on an answer, because it's a good question.
[385] Yeah.
[386] I've thought about it a little bit.
[387] I have.
[388] I have, but I need to really, you know, come up with a good answer for that.
[390] But honestly, like, yeah.
[391] Thanks, Kashmir.
[392] Thank you.
[393] Since Kashmir began reporting on Clearview AI, several major social media companies, including Facebook, Twitter, and Venmo, have demanded that the company stop using photos scraped from their websites.
[394] But it's unclear what if any power those social media companies have to force Clearview to comply.
[395] A few weeks ago, the state of New Jersey barred law enforcement from using Clearview's technology, but police remain free to do so in 49 other states.
[396] We'll be right back.
[397] Here's what else you need to know today.
[398] President Trump has begun a campaign of retribution against witnesses in the impeachment inquiry, firing Gordon Sondland, his ambassador to the European Union, who called the president's actions toward Ukraine a quid pro quo, and Lieutenant Colonel Alexander Vindman, a member of the National Security Council, who expressed alarm over the president's phone call with the leader of Ukraine.
[399] The Times reports that several Republican senators urged Trump not to fire the witnesses, fearing it would send a dangerous message, but that the president ignored their advice.
[400] And the global death toll from the coronavirus has reached more than 800, surpassing that of the SARS epidemic, which killed 774 in 2003.
[401] The number of confirmed infections from the coronavirus now stands at more than 37,000.
[402] Finally, new polling in New Hampshire, which will hold its primary tomorrow, shows Mayor Pete Buttigieg neck and neck with Senator Bernie Sanders, and former Vice President Joe Biden slipping into fourth place.
[403] Vice President Biden, the first question is for you.
[404] In the last few days, you've been saying the Democrats would be taking too big a risk if they nominate Senator Sanders or Mayor Buttigieg.
[405] But they came out on top in Iowa.
[406] What risks did the Iowa Democrats miss?
[407] The poll, conducted by the Boston Globe, WBZ, and Suffolk University, suggests Buttigieg is benefiting from a strong performance in the Iowa caucuses and that Biden may perform poorly for the second time in a row.
[408] A prediction Biden confirmed during Friday night's debate on ABC.
[409] Oh, they didn't miss anything.
[410] It's a long race.
[411] I took a hit in Iowa, and I'll probably take a hit here.
[412] That's it for The Daily.
[413] I'm Michael Barbaro.
[414] See you tomorrow.