Freakonomics Radio XX
[0] Dalton Conley is a sociologist at NYU.
[1] He has a book coming out soon called Parentology.
[2] It's about, well, here, let's have him tell you.
[3] I think the subtitle says it all, which is a social scientist experiments on his kids so you don't have to.
[4] So here they are.
[5] Okay.
[6] So here they are.
[7] You guys want to introduce yourselves?
[8] I don't care who goes first.
[9] Okay.
[10] I'm E like the letter.
[11] I'm 15, and I'm a student.
[12] Okay.
[13] Hi, E. I'm Yo.
[14] In, like, the slang. I'm 13, and I'm a student, too.
[15] That's right.
[16] Dalton Conley named his daughter E and his son, Yo.
[17] But there's more.
[18] Can you give your full name?
[19] E Harper Nora Jeremijenko-Conley.
[20] Okay, so E is your first name.
[21] The capitalized E. The idea is that she can choose what it stands for.
[22] Right.
[23] So, E, you still call yourself E at 15.
[24] Do you do so happily?
[25] Yes, I love my name.
[26] I don't blame you.
[27] Once you're called something, your whole life, you can't really change it.
[28] Yo, can you give us your full name?
[29] Yeah, sure.
[30] Yo Xing Heyno Augustus Eisner Alexander Weiser Knuckles Jeremijenko-Conley.
[31] So, Yo, okay. Your first name, Yo, comes from where?
[32] I think it comes from the Y chromosome.
[33] And that we were confounding ethnic stereotypes.
[34] So, you know, there's plenty of Howard Chungs out there who assimilate to white America by how they choose their first name, as a classic immigrant strategy.
[35] But there aren't many Conleys who take a Chinese...
[36] Right, going the other way.
[37] Yo was actually born with a slightly less complicated name.
[38] Yo Augustus Eisner Alexander Weiser Jeremijenko-Conley.
[39] The Xing, Heyno, and Knuckles were added later, when he was about four.
[40] And what about where these names were dropped in, the Heyno and the Knuckles?
[41] Whose choice was that?
[42] I think that's just pleasing to the ear.
[43] So the obvious question is, why?
[44] Why such unusual, complicated names?
[45] To some degree, it's an experiment.
[46] Because Dalton Conley thinks that who you are, who you turn out to be, may be related to what you're called when you're born.
[47] Of course, it's hard to separate out cause and effect here until Kim Jong-un allows me to randomly assign all the names of the North Korean kids.
[48] So I can't know whether I'm weird because I was given a weird name or because my parents are weird and they passed that on.
[49] But my gut tells me that it does affect who you are and how you behave and probably makes you more creative to have an unusual name.
[50] All right.
[51] On balance, for both of you guys, would you say that having an unusual name has been a positive or negative overall?
[52] Well, you can never really know, because you can't live another life, but I do think that I'm grateful for my name, and it has had a positive impact.
[53] What is it like to have a dad who's a sociologist who looks at children and people through a lens?
[54] Well, it's trained me a lot in, like, dealing with other adults because, like, when I was a kid, he could know when I'm lying, so I got really good at lying and stuff.
[55] But...
[56] God.
[57] Like, it kind of sucks to be experimented on.
[58] Like, all of a sudden, he's like, guess what, son?
[59] You're not getting computer or TV for a month because I want to see how that goes.
[60] So you've told me how you feel about having your name, but how do you feel about your parents giving you these names?
[61] Well, like, it doesn't really weigh on me at all anymore.
[62] But, like, there's a bunch of people on the internet that get super mad about it, like, have these angry comments about any article about it.
[64] Like, my dad's been called, like, the retard of the decade and stuff.
[65] Wow.
[66] Naming me that.
[67] Really?
[68] Of the decade?
[69] That's quite an honor.
[70] The F-tard of the decade.
[71] And does that hurt your feelings or more, like, on your dad's behalf?
[72] No, I found it really hilarious, actually.
[73] From WNYC and APM, American Public Media, this is Freakonomics Radio, the podcast that explores the hidden side of everything.
[74] Here's your host, Stephen Dubner.
[75] Hello, Dubner.
[76] I'm calling to tell you about my name.
[77] My name is Bronwyn Stein.
[78] Arwin, Laurelin, Honokalekyllai, Nix the 3rd.
[79] M-A-H-M-A-H-H-Z-Z-O-R-R-R-A-H-H-J-A-S-E-E-L-Y-N.
[80] You're never supposed to trust a man with two first names.
[81] A-U-N-A-K-U-N-A-H-H-W-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-A-R-D.
[82] My full name currently is Letter M-R-E-M-A-M-A-R-U-G-G-R-U-G-G-R-U-G-G.
[83] Which is both the question and the answer.
[84] So what do you think of someone who names his kids E and Yo?
[85] Well, that probably depends on a lot of things.
[86] Your personal preferences, also your religious and familial traditions.
[87] You may think it's clever and creative.
[88] You may think it's silly, even cruel.
[89] Now, will E and Yo, the people, turn out to be different than if they'd been named Sarah and Jake?
[90] As E put it, very wisely:
[91] Well, you can never really know because you can't live another life.
[92] You can't live another life.
[93] And that's why it's hard to measure something like the effect of a name.
[94] My Freakonomics friend and co-author, Steve Levitt, has spent his academic career trying to come up with clever ways to measure things.
[95] And he's thought quite a bit about the names we give our kids.
[96] I think ultimately all a name really does is it's a vehicle for the parents to signal what kind of person they are.
[97] It's really a means of establishing who they are, and/or the kind of person that they hope their child will become.
[99] I don't even know if I think it's the second.
[100] I think it really is about the parents. As I've studied naming, what I've come to believe is that the primary purpose when a parent gives a name is to impress their friends that they are whatever kind of person they want to be seen as.
[102] And I think some of the best evidence of this comes from the radical revolution in black names that happened in the 1970s.
[103] People don't really remember this.
[104] But if you go back to the 1960s, blacks and whites basically were giving their kids pretty much the same sets of names.
[105] Not really very different, a lot of overlap.
[106] But within about a seven-year period in the 1970s, names just completely diverged.
[107] And many African Americans now are given names that virtually no whites have.
[108] So what we saw was in a period that really coincides with the black power movement and a very strong move away from the initial civil rights movement was that names changed completely.
[109] And many black parents decided, I think, that the identity they wanted for their children was one that was distinct from white culture.
[110] Now, the fact is that black and white names 100 years ago could be really different, too.
[111] Black baby boys were often given names that relatively few whites had.
[112] Ambrose and Booker, Moses and Percy.
[113] And the modern equivalents, Deshawn and Marquise, Tyrone and Demetrius.
[114] Some years back, Steve Levitt started to wonder if these distinctively black names mattered.
[115] That is, whether they affected, for better or worse, the life of a kid with such a name.
[116] So Levitt did some research with Roland Fryer, an economist at Harvard who is devoted to explaining the gaps between blacks and whites in education, income, and culture.
[117] Here's Levitt again.
[118] We didn't really care about black names, per se.
[119] What Roland and I were trying to get at was black culture.
[120] So the idea was we knew that we observed really big differences in economic outcomes for African Americans and for whites.
[121] We know we observe really big cultural differences between African-Americans and whites.
[122] And the question was, is there any causal link between those two?
[123] Could it be that somehow black culture was interfering with black economic success?
[124] And the difficulty whenever you start talking about things like cultures, how do you quantify it?
[125] How do you capture what culture means in a way that an economist could find it in data?
[126] And so what we settled on was the idea that you could use names as an indicator of culture.
[128] Because the set of names that parents choose are very different for blacks and whites, and they also reflect the way that people think about the world.
[129] My name is Hawa, Nana, Aradua, Lasana.
[130] My parents are Ghanaian and Liberian, and I'm first-generation African-American.
[131] My first name is Shana.
[132] It's a Jewish name, but people always seem to think I'm black.
[133] My name is Stefan, and I'm coincidentally African-American.
[134] So people automatically assume that my name is Stefan when they're reading it, even though it's spelled S-P-E-N.
[135] So the ultimate question we wanted to answer is, does your name matter for the economic life that you end up leading?
[136] Are people who are, quote, saddled with distinctively black names facing a burden when they enter the labor market?
[137] So wanting to study names and having the right data set are two different things.
[138] But we managed to stumble onto an amazing data set that was kept by the state of California.
[139] It encompassed the birth certificates of every person born in the state of California between 1960 and the year 2000.
[140] And it included the name of the baby, the first and last name, the first and last name of the mother and the maiden name of the mother, along with a lot of other information about the hospital and the kind of health care that the mother had, which gave you a hint at some of the economic circumstances.
[141] And this turned out to be the absolutely perfect data set to do what we wanted to do.
[142] What we could do is we could match up two young African-American girls at birth, say born in 1965, who are born at the same hospital about the same time to a set of parents who, on all the data we have, look very similar, except that one of those sets of parents gives their daughter a distinctively black name, like Shaniqua, say.
[143] And the other set of parents give their baby a more traditional white name like Anne or Elizabeth.
[144] And so what do we do?
[145] We follow those girls.
[146] We fast forward, say, 25 years into the future when those girls grow up in California and have babies themselves.
[147] And so from when they give birth, we can see what kind of lives they're leading, whether they have fancy health care, whether they're married, how old they are when they have babies, things like that.
[149] And we get a glimpse into their economic life.
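The matched-pair design Levitt describes can be sketched in a few lines of code. Everything below is invented for illustration: the field names, the records, and the income numbers are hypothetical stand-ins, not the actual California data.

```python
# Toy sketch of a matched-pair comparison: pair up babies born in the same
# hospital and year, to similar mothers, who received different types of
# names, then look at the outcome gap within those pairs.
# All records and numbers here are made up for illustration.

from statistics import mean

records = [
    {"name_type": "black", "hospital": "H1", "year": 1965, "mom_age": 22, "adult_income": 41000},
    {"name_type": "white", "hospital": "H1", "year": 1965, "mom_age": 23, "adult_income": 40000},
    {"name_type": "black", "hospital": "H2", "year": 1966, "mom_age": 30, "adult_income": 55000},
    {"name_type": "white", "hospital": "H2", "year": 1966, "mom_age": 29, "adult_income": 56000},
]

def matched_pairs(records):
    """Pair records that match on hospital, birth year, and (roughly)
    mother's age, but differ in the type of name given."""
    blacks = [r for r in records if r["name_type"] == "black"]
    whites = [r for r in records if r["name_type"] == "white"]
    return [
        (b, w)
        for b in blacks
        for w in whites
        if b["hospital"] == w["hospital"]
        and b["year"] == w["year"]
        and abs(b["mom_age"] - w["mom_age"]) <= 2
    ]

pairs = matched_pairs(records)
# The quantity of interest: the average outcome gap *within* matched pairs.
# A gap near zero is the pattern Levitt and Fryer report -- the name itself
# adds nothing once family circumstances are held fixed.
gap = mean(b["adult_income"] - w["adult_income"] for b, w in pairs)
print(len(pairs), gap)
```

The point of the design is that each pair holds the observable birth circumstances fixed, so any systematic gap left over would be attributable to the name (or to unobserved differences between the families).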
[150] It's not perfect.
[151] We certainly don't know everything about them, but we know certain things about them.
[152] And we were able to see something quite remarkable, which is that the name that you were given at birth seems not to matter at all to your economic life.
[153] Remember that conclusion.
[154] The name you were given at birth does not seem to matter at all to your economic life.
[155] In other words, it's not the name your parents give you, it's the kind of parents you have in the first place.
[156] And different kinds of parents, of course, choose different kinds of names.
[157] So let's say two similar families, both African American, each have a baby girl.
[158] One is called Molly, which, as it happens, is one of the whitest girls' names in America.
[159] And the other is called Latanya, which is a distinctively black name.
[160] Now, if, decades later, Molly becomes, let's say, a professor at Harvard, and Latanya is just barely scraping by,
[161] well, the reason won't be because Latanya's parents named her Latanya.
[162] Begin, if you would, just by introducing yourself, say your name and what you do.
[163] Sure, I'm Latanya Sweeney.
[164] I'm a professor of government and technology here at Harvard.
[165] Okay, so there you go.
[166] At Harvard, Latanya Sweeney studies how technology can help solve society's problems.
[167] In the course of doing so, she occasionally discovers a new problem.
[168] Like the day not long ago, when she and a colleague named Adam Tanner were working in her office.
[169] He and I were working on a different project and he needed to find a paper of mine.
[170] So he went to my computer and Googled my name.
[171] And along with the links to various papers and so forth, this ad popped up to the right that said, Latanya Sweeney, arrested.
[173] And I basically almost fell out of the chair because one, I'd never been arrested.
[174] And then my name is so unusual that it's hard to imagine that that could have been a mistake.
[175] And the name appeared right in the ad.
[176] So then we typed in his name, a white male name, Adam Tanner.
[177] And the same company had an ad.
[178] But the ad just said, looking for Adam Tanner.
[179] It was very neutral.
[180] It didn't have the word arrested show up, no reference to a criminal record.
[182] So did you immediately become suspicious, or did you just think, well, this is some kind of one-off, and let me explore further?
[183] Well, right.
[184] I mean, on the one hand, you think it's a one-off, something kind of fluky.
[185] But on the other hand, you're like, well, why did it happen?
[186] And so we began just entering names and all kinds of names.
[187] And we spent a couple of hours doing so.
[188] The ads were for a company called Instant Checkmate, which sells public records.
[189] The ads appear when you do a Google search for the first and last name of a real person.
[190] But a given name search might generate different versions of the ad.
[191] Some of them are neutral, like looking for Molly Sweeney, and others, like the one Latanya Sweeney found, seemed to offer up arrest records.
[192] Sweeney and Tanner started doing lots of name searches to see if they could find a pattern to the ads.
[193] And we began focusing on Latanya versus Tanya.
[195] And what we found in each of those cases was if you had Latanya with a last name, you got an ad suggesting that you had an arrest record.
[196] And if you typed in Tanya with a last name, you didn't.
[197] Then Adam jumps to this conclusion.
[198] He says, oh, I get it.
[199] They're coming up.
[200] The arrest ads are coming up when there's a black sounding name.
[201] And I said, that's impossible.
[202] That's crazy talk.
[203] And I eventually got to the point where I said, okay, I'm a scientist.
[204] Let me put on my official science hat and start from step one and I'm going to show Adam he's wrong.
[205] That was the whole goal, was to show him he was wrong.
[206] The goal was never to write a paper.
[207] The goal was to show Adam he was wrong.
[208] The first step for Sweeney was to simply define what is a black name and what is a white name.
[209] So she assembled some data, which included the lists that we created for our first book, Freakonomics, of the whitest and blackest names among baby boys and girls.
[211] So the white female names were Molly, Amy, Claire, Emily, Katie, Madeline, Katelyn, Caitlin, and Emma.
[212] The black female names: Imani, Ebony, Shanice, Aaliyah, Precious, Nia, Deja, Diamond, Latanya, and Latisha.
[213] The white male names were Jake, Connor, Tanner, Wyatt, Cody, Dustin, Luke, and Jack.
[214] And the black male names: DeShawn, D'Andre, Marcus, Darnell, Terrell, Malik, Trevon, and Tyrone.
[215] In order to prompt the Google ads, Sweeney needed to find real first and last names, some black and some white.
[216] So she would type in a search like Shanice Ph.D. or Molly MBA to find real people, some of whom were, like herself, professionals.
[217] And then she would feed those real names back into Google to see what ads they would prompt.
[218] So break it down for me, Latanya.
[219] Having a distinctively black first name makes it how much more likely to prompt an ad for an arrest record, compared to having a distinctively white name?
[220] Well, a black-identifying name was 25% more likely than a white-identifying name to get an ad suggestive of an arrest record.
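For concreteness, here is the arithmetic behind a "25% more likely" comparison. The counts below are invented (Sweeney's actual samples were far larger); the sketch only shows how a relative-likelihood figure like hers is computed.

```python
# Hypothetical counts: how many searches of each name type
# produced an arrest-suggestive ad. All numbers are made up.
black_searches, black_arrest_ads = 400, 240
white_searches, white_arrest_ads = 400, 192

p_black = black_arrest_ads / black_searches   # share of black-name searches with an arrest ad
p_white = white_arrest_ads / white_searches   # share of white-name searches with an arrest ad

# Relative increase: how much more often the arrest ad appears
# for black-identifying names than for white-identifying ones.
relative_increase = p_black / p_white - 1
print(f"{relative_increase:.0%}")
```

Note this is a relative comparison between the two groups, not an absolute rate; both groups can see arrest ads often and the relative gap can still be what matters.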
[221] All right.
[222] So you may be thinking that that makes sense, because the average black American is more likely to get arrested than the average white American.
[224] Well, what's interesting is these ads appear regardless of whether the company actually has a criminal record for that name in their database.
[225] As most people know by now, Google makes its money with a program called AdWords, which serves ads that are linked to the content that you search for.
[226] Advertisers, like Instant Checkmate, agree to pay a certain amount each time their ad is clicked on.
[227] They provide Google with several versions of ad text, and they can specify which keywords, or in this case, which key names, will prompt each version of the ad.
[228] It is, of course, in the best interest of both Google and the advertiser to serve the ads that will get the most clicks.
[229] The idea of the Google algorithm is it says, okay, we don't know which of these five versions of ads are going to make the most money.
[230] So what we're going to do is we're going to let the algorithm learn over time which one tends to get the most clicks.
[231] So at first, all five ad copies, say, for Ebony Jones, are equally likely to appear.
[232] And so it would randomly pick one on a search of Ebony Jones, display it.
[233] If that one gets clicked, it gets weighted, and so over time, the one having the heaviest weight will get displayed more often.
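The learn-from-clicks rotation described above can be sketched as a simple weighted sampler. This is only an illustration of the general mechanism, not Google's actual algorithm; the ad texts, weights, and simulated click behavior are all invented.

```python
# Minimal sketch of click-weighted ad rotation: every version starts
# equally likely, and each click bumps that version's weight, so the
# most-clicked copy is displayed more and more often over time.

import random

class AdRotator:
    def __init__(self, ad_texts):
        # All versions start with equal weight.
        self.weights = {text: 1.0 for text in ad_texts}

    def serve(self):
        ads = list(self.weights)
        return random.choices(ads, weights=[self.weights[a] for a in ads], k=1)[0]

    def record_click(self, ad_text):
        # A click makes this version more likely to be shown next time.
        self.weights[ad_text] += 1.0

rotator = AdRotator(["Looking for Ebony Jones?", "Ebony Jones, arrested?"])
random.seed(0)
for _ in range(1000):
    shown = rotator.serve()
    # Simulate the bias effect Sweeney hypothesizes: users click the
    # arrest-suggestive copy, and ignore the neutral one.
    if "arrested" in shown:
        rotator.record_click(shown)
print(rotator.weights)
```

Run it and the arrest-suggestive copy ends up with far more weight than the neutral one, even though both versions started out identical; the "villain" in this toy model is purely the clicking audience the algorithm learns from.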
[234] If we assume for a moment that Instant Checkmate had placed ads with roughly the same text for all the names evenly, let's just assume that's the case, then an explanation of what we're seeing is that it's basically some kind of bias effect from society.
[235] So people see an arrest ad for a black name, they tend to click it. But when they see the arrest ad associated with a white name, they tend to ignore it.
[236] Okay.
[237] So this is important, though, because when you come out with a finding like this, most people immediately want to search for the villain.
[238] You're saying the villain might be the company.
[239] The villain might be Google.
[240] And the villain might be all of us.
[241] Right.
[242] So let's get back to your name.
[243] So when your name first showed up, when Adam searched for your name on your computer, and the ad that was generated said Latanya Sweeney arrested.
[244] Take me down the road now from there to why that matters, what it implies, what it made you feel personally about your name being there?
[245] And more broadly, what's wrong with that?
[246] In terms of for me personally, it was really this shock factor, you know, that I had never been arrested.
[247] And it's kind of you don't want that associated with you.
[248] You know, like why should that be associated with my name or my image to anyone.
[249] When I put my scientific head on, the question was what does racial discrimination really mean and how do you operationalize it scientifically or statistically?
[250] And so racial discrimination basically results when a person or a group of people are being treated differently.
[251] You either give or withhold benefits, facilities, services, opportunities.
[252] There might be some kind of economic loss or something along those lines that they would otherwise be entitled to, but they're being denied it on the basis of race.
[253] The other thing that I looked to, in terms of structuring how this fit into societal norms versus technology, was realizing that searching online, especially when the ads are delivered by such a huge service like Google ads, almost begins to harbor this notion of structural racism, that is, that you can't help but have it foster a discriminatory outcome.
[254] So, say, two people are in a contest.
[255] I Google one name and I end up with an arrest ad.
[256] I Google the other name.
[257] And there's no implication of an arrest at all.
[258] Even if I never click it, it has the difference of that implication.
[259] So even though you obviously have a good job now, did it concern you for your future?
[260] No, no, no. I tell you, where I got really moved in that regard was more looking at the names of these young Ph.D. students and people who were just launching their careers.
[261] There was one name.
[262] I forget which name it is, but I remember it was a young woman.
[263] She was so proud she had just published her first paper.
[264] She was a graduate student in a Ph.D. program.
[265] And, you know, there's her name and there's this ad, arrested, and how wrong that was.
[266] It just seems so wrong.
[267] For the record, a Google spokesperson told us that, quote, AdWords does not conduct any racial profiling.
[268] It is up to individual advertisers to decide which keywords they want to choose to trigger their ads, end quote.
[269] Instant Checkmate didn't respond to our query, but an official statement from the company about Latanya Sweeney's study says, quote, Instant Checkmate would like to state unequivocally that it has never engaged in racial profiling in Google AdWords and that we have absolutely no technology in place to even connect a name
[270] with a