Freakonomics Radio
[0] All right, here's a question.
[1] How many men are gay?
[2] About 5%.
[3] Does advertising work?
[4] Yes.
[5] Why was American Pharoah a great racehorse?
[6] Big left ventricle.
[7] Is the media biased?
[8] Yeah, it gives you what you want to read.
[9] Are Freudian slips real?
[10] No. Who cheats on their taxes?
[11] Everybody who knows how to cheat.
[12] Who is this person?
[13] And how does he know these things?
[14] That's what you'll find out on this episode of Freakonomics Radio.
[15] That and a lot of other things, some of which are pretty disturbing.
[17] For instance, how Google searches for a particular racial epithet can spike after certain events.
[18] There was a big increase in searches when Obama was elected.
[19] It turns out that Google search data can tell us a lot about ourselves that we may not even tell ourselves.
[20] People just are in such a habit of lying in their day-to-day life.
[21] People lie to their, you know, partners or their kids or their parents.
[22] And when you drill down into this ruthlessly honest database, you're bound to be surprised.
[23] The top search that starts "my husband wants" in India is "my husband wants me to breastfeed him."
[24] From WNYC Studios, this is Freakonomics Radio, the podcast that explores the hidden side of everything.
[25] Here's your host, Stephen Dubner.
[26] There have been quite a few prominent terrorist attacks in recent years.
[27] An explosion at the Stade de France, north of Paris.
[29] Berlin early this morning.
[30] A weapon of mass murder is slowly removed.
[31] Today, London suffered a horrific attack near Parliament Square.
[32] Most of these attacks have one thing in common.
[33] The enemy is, in fact, radical Islam.
[34] An ideology.
[35] Afterward, politicians tend to encourage unity.
[36] London is the greatest city in the world.
[37] And we stand together in the face of those who seek to harm us and destroy our way of life.
[38] And they encourage us to not equate Islamist terrorists with Islam.
[40] The attacks have nothing to do with Islam, which is followed peacefully by millions of people around the world.
[41] How effective is this sort of encouragement?
[42] Okay, do you want to just tell me if I'm talking too long?
[43] That is Seth Stephens-Davidowitz.
[44] I'm an economist, a data scientist, and an author.
[45] And he has studied the effectiveness of this kind of political speech.
[46] This speech, for instance.
[47] Good evening.
[48] President Obama was responding to a terrorist attack in San Bernardino, California.
[49] A Muslim husband and wife shot and killed 14 people and seriously injured another 22.
[50] We cannot turn against one another by letting this fight be defined as a war between America and Islam.
[51] The speech was really, I thought, beautiful and kind of moving.
[52] It's our responsibility to reject proposals that Muslim Americans should somehow be treated differently.
[53] Because when we travel down that road, we lose.
[54] He talked about how important religious tolerance has been to America and how everyone has a responsibility to not give in to fear, but really appeal to freedom.
[55] And everybody has a responsibility to not judge people based on the religion and not give religious tests when deciding who enters this country.
[56] Shortly after the San Bernardino attack, Stephens-Davidowitz and a colleague, Evan Soltas, published a piece in the New York Times called The Rise of Hate Search.
[57] The primary evidence came from Google search data.
[58] So searches like "kill Muslims" and "I hate Muslims" and "Muslims are evil" or, you know, really, really nasty searches.
[59] They looked at the frequency of that kind of search before, during, and after Obama's speech.
[60] We were founded upon a belief in human dignity.
[61] It was a very, very well-received speech.
[62] So, did the speech curtail anti-Muslim Google searches?
[64] We found that all these searches during the speech actually went up, even as he was saying that it was our responsibility to reject fear and our responsibility to not judge people based on religion.
[65] Let's make sure we never forget what makes us exceptional.
[66] But searches against Syrian refugees were going up and searches to kill Muslims were going up and searches for I hate Muslims were going up.
[67] So it seemed like everything that Obama was doing, even though all the traditional sources were saying that he was doing a great job, was actually backfiring in terms of its real goal, which was to calm an angry mob that had been inflamed by the San Bernardino attacks.
[69] So the book you've written is called Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are.
[70] I understand that's not the title you were wanting.
[71] The title that I wanted for my book was How Big Is My Penis: What Google Searches Reveal About Human Nature.
[72] My publisher was like, you know, like people would be embarrassed to buy that in an airport.
[73] And what share of the data that you're writing about in the book is Google data?
[74] You have other sources as well.
[75] Yeah, it's not all Google data.
[76] I use anonymous and aggregate data from Pornhub.
[77] I scraped some websites.
[78] So I scraped a hate site, Stormfront.
[79] I scraped Wikipedia.
[80] I use some Facebook advertising data and some other sources.
[82] And just tell us quickly your academic background, what you studied and where?
[83] I have a BA in philosophy from Stanford and a PhD in economics from Harvard.
[84] Why'd you study philosophy?
[85] Just curious about it?
[86] I had big questions about like the meaninglessness of life and the absurdity of the human condition and stuff.
[87] But they weren't really answered.
[88] I just got more and more depressed and then so I stopped.
[89] But did your change in vocational course, the PhD in econ and now doing what you do now, shed some light on the big existential and philosophical questions?
[90] No, I'm just trying to, like, ignore them, not think about them, but, yeah.
[91] He didn't really ignore the big questions about the human condition.
[92] He just found a different window through which to seek the answers.
[93] I was getting my economics PhD, and I found that Google had released basically data on searches, where people made searches, when people made searches.
[94] And I kind of became obsessed with this data to the point I really couldn't think about anything else afterwards.
[95] And so my dissertation was entirely on things we could learn about people from Google searches.
[96] So I studied racism and child abuse and predicting turnout.
[97] That was my dissertation.
[98] In 2006, Google began making its search data public through a tool called Google Trends.
[99] The data is all anonymous and aggregate, so it's how many people make searches in a given city or a given state over some time period.
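(A quick aside for readers who want to poke at this themselves: here is a minimal sketch of pulling that kind of anonymous, aggregate data with the unofficial, third-party pytrends client. The library, keyword, timeframe, and geography are my own illustrative choices, not anything mentioned in the episode, and Trends returns a normalized 0-100 interest index rather than raw counts.)

```python
# Minimal sketch (assumption: the unofficial "pytrends" client, pip install pytrends).
# Google Trends returns a normalized 0-100 interest index, not raw search counts.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)

# Illustrative keyword, timeframe, and region -- not from the episode.
pytrends.build_payload(["anxiety"], timeframe="2015-01-01 2016-12-31", geo="US")

over_time = pytrends.interest_over_time()                      # weekly index for the U.S.
by_state = pytrends.interest_by_region(resolution="REGION")    # index by U.S. state

print(over_time.head())
print(by_state.sort_values("anxiety", ascending=False).head())
```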
[101] Stephens-Davidowitz's insight, not that he was the only one with this insight, was that within the privacy of their own Internet browsers, people are more likely to express their true preferences than they would in the traditional surveys and other data-gathering methods that researchers historically use.
[102] Those are subject to what's known as social desirability bias.
[103] Social desirability bias is basically that you want to look good in a survey.
[104] So instead of saying the truth, you say what is desirable.
[105] So anything that is socially unacceptable will be underreported in surveys.
[106] So a classic example that we know is if you ask people, if you voted in the previous election, a huge percentage of the people who don't vote say that they vote because it's considered socially undesirable to not vote in an election.
[107] How then do economists feel about surveys?
[108] Economists kind of hate surveys because you can't really trust what people tell you.
[109] You have to see what they actually do.
[110] And you have to pay attention to incentives.
[111] So a problem with surveys is you don't really have any incentive to tell the truth.
[112] Whereas if you're online, you have an incentive to tell the truth to get the information that you actually need.
[113] So considering that most surveys are done either anonymously or with someone with whom you have zero repeat transactions, why do you think the human animal is predisposed toward, you know, protecting or burnishing their reputation, even in a case where the stakes almost couldn't be lower?
[114] Why do you think we do that?
[115] People just are in such a habit of lying in their day-to-day life.
[116] People lie to their, you know, partners or their kids or their parents, so these behaviors kind of carry over into surveys.
[117] How many lies have you told me already in this conversation?
[118] I can't... I don't know. Do you think, like, fewer or more than 10? I think I'm being pretty honest, actually, and you and listeners can decide whether they agree with this: I think I'm, like, a compulsively honest person. You say you're compulsively honest; the title of your book is Everybody Lies, so plainly you're drawing yourself as an outlier. I could have said, like, 98% of people lie, but that wouldn't have been a sellable book, right? Do you think your compulsive honesty pays off, or do you feel that compulsive honesty really makes your life more difficult, and that lying, obviously with a million variations and shadings, is overall a pretty sensible strategy for, you know, life?
[119] I think it is sensible.
[120] I've learned that. I just, like, changed my dating profile.
[121] I had, like, a really kind of just okay picture or maybe even a mediocre picture because I didn't want to be misleading.
[122] And I was getting, like, no dates.
[123] And I'm like, wait, this is stupid.
[124] So then I changed to, like, a really, really good picture.
[125] And I'm like, oh, that's what everybody does.
[126] That makes a lot of sense.
[127] Was it still you in the picture?
[128] It's still me, but it's lying by, you know, by emphasis, right?
[129] Uh-huh.
[130] I thought.
[131] And what's the progress been on the dating front?
[132] Much better, with a better picture.
[133] So, when we are putting out information about ourselves, we may lie.
[134] But when we want to find information, via Google, let's say, there's no incentive to lie.
[135] That wouldn't get you the results you want.
[136] So we open up.
[137] We tell Google our secrets.
[138] There are lots of bizarre questions, not just questions, but statements that people make on Google.
[139] So, you know, I'm sad or I'm drunk or I love my girlfriend's boobs.
[140] Like, why are you telling Google that?
[141] I think it feels like kind of a confessional window where people just type statements with no reason that Google would be able to help.
[142] You write in the book, the microscope showed us there's more to a drop of pond water than we think we see.
[143] The telescope showed us there's more to the night sky than we think we see.
[144] And new digital data now show us there's more to human society than we think we see.
[145] So I love that thought.
[146] I'm not sure I believe it, in that I'm not sure the ramifications will be so large, because, you know, the societal insights you're talking about are often just a refinement or even a confirmation of what we've already learned through centuries of, you know, philosophy and psychology and other fields of inquiry.
[148] So tell me why you're so convinced that this revolution will be as big as the others.
[149] Well, I don't think that we're just learning things that we already know.
[150] I think we're learning a lot of things that we had no idea about, ways in which our intuition was way off about people.
[152] So if you talk about like what makes people anxious, that's like a huge question, right?
[153] Like, and I did a couple studies.
[154] I said, does anxiety rise after terrorist attacks?
[155] And you can see Google searches for anxiety in places after a terrorist attack.
[156] They don't seem to rise.
[157] And you can say, like, does anxiety rise when Donald Trump is elected?
[158] Everyone's saying they're all anxious.
[159] There's no rise in anxiety there.
[160] So that, like, pretty much changes how we think about society.
[161] Like, that's pretty revolutionary relative to the data we've had on human beings before.
[162] And I think there are lots of things about people that we just had no idea about.
[163] One of my favorite examples, and this is just bizarre, but the top search that starts "my husband wants" in India is "my husband wants me to breastfeed him."
[164] And that nobody knows about.
[165] And, like, literally, after I published that finding, they started interviewing people in India about this finding, and nobody knew about it.
[167] Like, doctors are like, we've never heard of this.
[168] But, like, the fact exists.
[169] Like, there are a reasonable number of men in India, like, much higher than in any other country, that have this desire, but they don't tell anybody because it's secret.
[170] So those things exist.
[171] There are basically facts about human nature that we didn't know because people don't talk about them.
[172] Some of the facts about human nature are unsettling, to say the least.
[173] Stephens-Davidowitz spent a lot of time looking for racial hatred as evidenced by the use of the N-word.
[174] In the time period I was studying, it was about as frequent as searches like migraine and economist and Lakers and Daily Show.
[175] So it wasn't a fringe search by any stretch of the imagination.
[176] I think it was about 7 million total searches.
[177] He found that searches like this would rise and fall.
[178] Underwater, here in New Orleans tonight, after the giant storm came the rising waters.
[179] They rose a lot during Hurricane Katrina.
[180] There were all these depictions in the media of African-Americans in a real struggle.
[181] An Army National Guard helicopter today rescued people from rooftops, fragile islands in the floodwaters.
[182] And disturbingly, people were making an unusually large number of searches mocking African Americans during that period.
[183] And also, they rise a lot every year on Martin Luther King Jr. Day, which is also disturbing.
[184] Free at last.
[185] Free at last.
[186] Thank God Almighty.
[187] We are free at last.
[188] There was a big increase in searches when Obama was elected.
[189] Hello, Chicago.
[190] That was the week with among the highest searches for racist material in the history of Google search.
[191] If there is anyone out there who still doubts that America is a place where all things are possible.
[192] On the night Obama was elected in 2008, Stephens-Davidowitz found that, of all the Google searches that included the word Obama, 1% of them also included either the N-word or KKK.
[193] Which may not sound like a huge amount, but one in a hundred, when you think of all the reasons to Google Obama that night, I mean, he's the first black president.
[194] You can Google about his victory speech or his family or his history or lots of other things about him.
[195] I was pretty shocked by how frequently people found the negative reason to make that search.
[196] United States of America.
[197] So that's when these racist searches were happening.
[198] How about where?
[199] That was also surprising.
[200] If you had asked me, where are racist searches highest in the United States or where is racism in general highest in the United States?
[201] I would have said that it's a southern issue, right?
[202] Like you think of the history of the United States, slavery, Mississippi, Louisiana, Alabama.
[203] Those states are definitely among the highest.
[204] But other areas are right near the top or even at the top: the number one state is West Virginia, and then Pennsylvania, particularly western Pennsylvania, eastern Ohio, parts of Michigan, industrial Michigan, upstate New York, it's very, very high.
[205] There's really not a big difference between north versus south.
[206] It's east versus west, where these racist searches drop substantially once you get west of the Mississippi River.
[207] So let me ask you to just talk a little bit in more detail about the map of racism and how it related to the last several presidential elections.
[208] Yeah.
[209] Well, then I was reading this paper by some economists at Berkeley.
[210] They were using general social survey data to measure racism.
[211] And they had asked the question whether racism played a factor in Obama's vote total in 2008, even though he won, did he lose votes because of racism?
[212] And they concluded using this general social survey data that it was not a factor, that racial attitudes were not a big predictor.
[213] But again, learning what we've learned from talking to you today, we have to say, well, wait a minute, anything like that based on survey data is suspect.
[215] Was that your first thought as well?
[216] Yeah, maybe that's suspect, like, would the Google searches show anything different?
[217] So you can't really just compare how many votes Obama got in places where racism is high and racism is low because those areas may have opposed any Democratic candidate, right?
[218] But what you can do is you compare how did Obama do to the previous Democratic candidate, John Kerry, who was white and had similar views and how did he compare to other Democratic candidates?
[219] And when you do this, you see very, very clearly, like, a really, really strong relationship: in places that make lots of racist searches, Obama got substantially fewer votes than the previous white Democratic candidate.
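(To make that comparison concrete, here is a toy sketch of the kind of regression being described: how much worse Obama did than the previous white Democratic candidate, as a function of an area's racist-search rate. All numbers are invented for illustration; the published analysis uses media-market-level data and many controls.)

```python
# Toy sketch of the comparison described above; the data below are invented.
import numpy as np
import statsmodels.api as sm

# Hypothetical areas: a racist-search index and two-party Democratic vote shares.
racist_search_index = np.array([0.2, 0.5, 0.9, 1.3, 1.8, 2.4])
kerry_2004_share    = np.array([0.48, 0.51, 0.44, 0.47, 0.42, 0.45])
obama_2008_share    = np.array([0.52, 0.54, 0.44, 0.45, 0.39, 0.40])

# Outcome: how Obama did relative to the previous white Democratic candidate.
change = obama_2008_share - kerry_2004_share

X = sm.add_constant(racist_search_index)
fit = sm.OLS(change, X).fit()
print(fit.params)  # a negative slope means Obama underperformed where racist searches were high
```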
[220] So you're telling us in retrospect that Obama was in some ways an even stronger candidate than we realized, right, winning two elections despite substantial bias against a black candidate?
[221] I calculated he lost about four percentage points from racism.
[222] He also got about one to two percentage points from increased African -American turnout.
[223] But yeah, on balance, I think he's, like, the most charismatic president in history, and charisma counts for a lot in politics.
[224] So what does this say generally about overt or public versus covert or private racism?
[225] Well, so over the past 10 or 15 or 20 years in the social sciences, they've been trying to answer a big paradox, which is that African Americans have very, very bad life outcomes, but white people say they're not racist, right?
[226] And the traditional answer to this is implicit bias.
[227] So, like, you, I, everybody listening, all of us have some subconscious associations between some negative outcomes and black people.
[229] And this has been used to explain why African Americans are struggling.
[230] And I think one of the things that this research shows is probably that explicit racism may be playing a bigger role, not this implicit subconscious stereotypes that have dominated the research in the last 20 years or so.
[231] What did your map of racism predict or tell you about the election of Donald Trump?
[232] Well, I didn't actually do this, but Nate Cohn, he's kind of a stats guy at the New York Times.
[233] He got data on Trump support in the Republican primary.
[234] And he asked me for the explicit racism data.
[235] And he said that the biggest predictor he could find of Trump's support in the primary was this Google racism data, stronger than education or age or lots of other things.
[236] And what can that tell you or what can you tell us about Hillary Clinton?
[237] I mean, if Obama carried the day twice despite the anti-black bias, can you tell us anything about whether the anti-female bias against Hillary Clinton may have been enough to change the outcome?
[239] No, that's like a question.
[240] I think I get an email once a week asking me to look into that.
[241] I think it's a little bit harder than with African-Americans.
[242] There's pretty much one word that is searched more than every other potentially racist word.
[243] I can think of one word that Hillary's been called a lot.
[244] That would probably get you fairly far, no?
[245] Well, the issue with sexism is that a lot of the negative words are also porn searches.
[246] Coming up on Freakonomics Radio, is Seth Stephens-Davidowitz's work being acted upon by people in high places?
[247] I think possibly someone from Obama's staff read it, because a few weeks later he gave another speech at a Baltimore mosque.
[248] And how often do you have sex?
[249] Several times per week.
[250] Maybe once or twice a week, if I'm supposed to average it.
[251] I think that they'll be exaggerating how often they're having sex.
[252] That's coming up right after this break.
[253] It was while getting his Ph.D. in economics that Seth Stephens-Davidowitz started using Google search data to try to better understand the world.
[254] So it seemed natural to use those data for his dissertation.
[255] What did his thesis advisors think of this idea?
[256] They all liked it, but they were like, you might not get an academic job.
[257] Did you care about an academic job?
[258] Yeah.
[259] I thought I wanted to be a professor.
[260] So, yeah, I did care, but I didn't get one.
[261] What about the difficulty of getting the dissertation published?
[262] So it was considered kind of weird to use Google search data, and I kind of got some angry responses from journals and the academic market.
[263] I didn't think it was weird, but everyone was telling me it was weird, which is kind of like my life.
[264] I'm always weird, and don't think I'm weird.
[265] Part of his dissertation eventually was published as a paper in the Journal of Public Economics.
[266] It was called The Cost of Racial Animus on a Black Presidential Candidate: Evidence Using Google Search Data.
[267] It didn't get him a job in academia, but it did help get him a job at Google.
[268] Hal Varian, the chief economist there, he liked my work.
[269] He's kind of like also, I think weird and doesn't realize he's weird maybe.
[270] And he was obsessed with Google data like long before I was and kind of started this whole thing.
[271] So we really bonded.
[272] And then what did you do there and how long were you there?
[273] I was there for about a year and a half.
[274] It's kind of like in-house consulting, maybe.
[275] Like, Google doesn't really outsource its consulting to, like, McKinsey; they kind of like having a team inside who understands their data and can help them make decisions.
[277] What kind of Google data were you interpreting and then telling Google about?
[278] A lot of like advertising stuff.
[279] Your tone of voice implies lack of thrill.
[280] Is that the case?
[281] That's why I quit.
[282] Would you have been able to write the book that you've written were you still working at Google?
[284] I think I probably could have, but I maybe would have had to have better social skills to deal with the PR department.
[285] I think my social skills have improved a lot in the last two years, but they weren't that great back then; tact was not in my skill set in my 20s.
[286] Give me an example.
[287] What do you mean by that?
[288] I'd just, like, be very aggressive and, like, thought I knew all the answers and stuff, so.
[289] And how old are you now?
[290] 34.
[291] So you're done with that phase of your life?
[292] I hope so, yeah.
[293] Talk for a minute about, I guess, your level of confidence or, you know, your argument for the strength of the evidence in that a search, a Google search, is, I would call it sort of a proxy for some behavior or question or activity or whatnot.
[294] So it's not the fact itself.
[295] It's not the data itself, but a query seeming to represent the fact itself.
[296] So can you talk for a minute about how substantial you feel the relationship is between the search and the thing?
[297] And what gives you that confidence?
[298] There have been a lot of examples where people have correlated searches with real -world behaviors.
[299] So there's one study that compares searches for suicide, and these correlate highly with actual suicides.
[300] And the Google searches for suicide correlate much higher than surveys for suicide.
[301] I've done research showing you can predict how many people will turn out to vote based on whether people search "where to vote" or "how to vote" before an election; these correlate much higher than surveys with how many people actually turn out to vote.
[302] These crazy searches, "kill Muslims," these really nasty searches about Muslims.
[303] I've shown with Evan Soltas, then at Princeton, that these correlate with hate crimes against Muslims.
[304] So I think the fact that over and over again they correlate and usually correlate much stronger than other data sets is proof that even some of the stranger searches have real information in them.
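(For the technically curious, the validation step he is describing amounts to a simple correlation between a search index and an independently measured outcome across places. A minimal sketch with placeholder numbers follows; in practice you would join state-level Trends indexes to official statistics such as certified turnout or CDC suicide rates.)

```python
# Minimal sketch of validating a search index against a real-world outcome.
# The numbers are placeholders, not real data.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "state": ["A", "B", "C", "D", "E"],
    "search_index": [62, 48, 75, 30, 55],            # e.g., "how to vote" interest
    "outcome_rate": [0.61, 0.50, 0.72, 0.35, 0.58],  # e.g., actual turnout share
})

r, p = pearsonr(df["search_index"], df["outcome_rate"])
print(f"search index vs. outcome: r = {r:.2f}, p = {p:.3f}")
```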
[305] Real information that may, in some cases, be useful.
[306] So if you're talking about people who search kill Muslims or I hate Muslims, this is not, you know, your average American.
[307] This is someone with extreme animosity and rage and violent thoughts.
[308] So this is kind of a unique sample of people, even if it's small, that would be basically impossible to capture in a survey or to find in a university laboratory experiment.
[309] But because Google searches have everybody, they also have this small, tiny mob.
[310] And we can study really for maybe the first time what actually inflames an angry mob and what actually calms down an angry mob.
[311] As we heard earlier, anti-Muslim searches rose when President Obama was trying to calm things down after the San Bernardino attack.
[312] ISIL does not speak for Islam.
[313] They are thugs and killers.
[314] But in the last few minutes of that speech, the president changed tack.
[315] Obama talked about how Muslim Americans are American heroes.
[316] Muslim Americans are our friends and our neighbors, our coworkers, our sports heroes.
[317] And yes, they are our men and women in uniform who are willing to die in defense of our country.
[318] A nation of Googlers also changed tack.
[319] You saw for the first time in many years the top descriptor of Muslims on Google was not Muslim terrorists or Muslim refugees.
[320] It was Muslim athletes and Muslim soldiers.
[321] They both skyrocketed and stayed up about a week afterwards.
[322] He was collaborating on this project with Evan Soltas.
[323] So what Evan and I concluded was maybe lecturing people is not the best way to change their mind or to calm them down if they're enraged, but subtly provoking their curiosity, offering a new description of a group that is causing them so much angst is maybe more effective.
[324] And then we wrote this up in the New York Times.
[325] It got some attention, and I think possibly someone from Obama's staff read it, because a few weeks later he gave another speech at a Baltimore mosque.
[326] Please be seated.
[327] And he really stopped with all the lectures and the sermonizing, and he instead focused much more on the curiosity-provoking approach.
[328] So he talked about not just Muslim athletes and Muslim soldiers, but also Muslim firefighters and Muslim teachers, and how Thomas Jefferson had a copy of the Quran and how Muslim Americans built the skyscrapers in Chicago.
[329] Generations of Muslim Americans helped to build our nation.
[330] They were part of the flow of immigrants who became farmers and merchants.
[331] So he kind of doubled down or quadrupled down on this curiosity strategy, and it does seem like right after these words were spoken, the angry searches about Muslim Americans actually went down.
[332] There was a drop in searches for "kill Muslims" and "I hate Muslims" after Obama gave this speech.
[333] Stephens-Davidowitz's book is stuffed with examples of the behaviors that, according to him, everybody lies about, especially on traditional surveys.
[334] So we recruited some Freakonomics Radio listeners, promised them anonymity, asked them some typical survey questions, and asked Stephens-Davidowitz to predict what they would say.
[335] So if we asked people how frequently they have sex, what do you think they would say?
[336] I think men will say about one and a half times a week and women will say about once a week.
[337] Oh, I mean, it varies from week to week, maybe once or twice a week, if I'm supposed to average it.
[338] I would say maybe three or four times a month, several times per week.
[339] And then how does that compare to the reality as best we know?
[340] I think that they'll both be exaggerating how often they're having sex.
[341] And how do you know that they're exaggerating?
[342] I did this comparison.
[343] The general social survey asked men and women how frequently they have sex and whether they use a condom.
[344] And if you do the math on that, then American men say they use 1.6 billion condoms in heterosexual sexual encounters.
[345] And American women use 1.1 billion condoms in heterosexual sexual encounters.
[346] And obviously those by definition have to be the same, right?
[347] So, like, you know already that someone's lying.
[348] But then I got data from Nielsen on how many condoms are sold every year in the United States, and only 600 million condoms are sold every year.
[349] And that doesn't mean that they're lying about how much sex they're having.
[350] They might just be having more unprotected sex.
[351] But if you actually look at like the best math on how frequently people get pregnant, if people are having as much unprotected sex as they say they're having, there would be more pregnancies every year in the United States.
[352] Right.
[353] Although then you have to factor in terminations as well, correct?
[354] Yeah, even including how many abortions there are.
[355] So, in other words, the bottom line is people lie a lot, to a significant degree.
[356] Like, what would you put the rate of exaggeration at for sex generally, sex frequency?
[357] Like three to one for men and two to one for women.
[358] Wow.
[359] If we ask people, if they watch pornography, what will they say and how accurate will that be?
[360] I thought that everybody would say yes because I thought that in this day and age, at least the males would be okay saying that they watch pornography.
[361] No, no, no. Yes.
[362] Is everybody just saying no?
[363] On occasion.
[364] I think everybody has urges.
[365] They need to be fulfilled.
[366] I had no clue you guys would ask me that.
[367] I just don't.
[368] I never have.
[369] Well, women make up about 20% of pornography views now.
[370] So that's probably some deception there.
[371] So we asked a bunch of people if they think Super Bowl ads make them more likely to buy the product that's being advertised.
[372] What do you think they'll say and what's the reality?
[373] I think they'll probably say no, because people don't like to think that they're influenced by ads.
[375] No. No. No. No. I think they increase the awareness, but I don't find many of the Super Bowl ads relevant to me. When I'm looking to buy a product, I don't at least consciously think that I get my information for it from commercials.
[376] The reality is definitely, yes.
[377] The way people have studied this is comparing product purchases in cities whose teams made the Super Bowl versus cities whose teams just missed it.
[378] So you get a big shock to viewership and those cities end up buying those advertised products much more.
[379] So they're clearly very effective.
[380] All right, Seth, I think you make a very persuasive argument that Google search data is, you know, a great tool to figure out who we are and what we care about and so on, especially when it's not going to be revealed in a more traditional way.
[381] But obviously Google search data hardly reveals everything.
[382] So I'd like you to just tell us one thing that's provocative or embarrassing or surprising about you that we will never ever be able to learn from a Google search.
[383] How does it feel now?
[384] Yeah, right.
[385] One thing that's embarrassing or surprising about me that you'd never learn from a Google search.
[386] You have to get back to us on that?
[387] You want to get back to us by email?
[388] We can note that there were like 18 seconds of incredibly awkward silence followed by an email a week later.
[389] Yeah, that's fine.
[390] Okay.
[391] A week later to the day, Seth Stephens-Davidowitz did send an email.
[392] Subject line, embarrassing thing I have never Googled.
[393] The email read, quote, I am embarrassed and insecure about how I sleep.
[394] I've been told I twitch and jerk like a maniac.
[395] For some reason, I've never Googled this particular issue, but it is possible someone who has shared a bed with me has.
[396] In case you are curious, the top three Googled complaints about male partners are that he talks, twitches, and jerks in his sleep.
[397] The top three Googled complaints about female partners are that she talks, farts, and masturbates in her sleep.
[398] End quote.
[399] That's our show for today.
[400] Thanks for listening.
[401] Again, Seth Stephens-Davidowitz's book is called Everybody Lies.
[402] Next time on Freakonomics Radio.
[403] Hi, this is Steve Ballmer.
[404] I am a retired CEO of Microsoft.
[405] Steve Ballmer's new project?
[406] It's a sort of fiscal colonoscopy on the American government.
[407] If I'm a citizen, I don't want to know just where the government got its money, from whom, and where it spent it.
[408] But is it working at all, or at least what activity is it generating?
[409] He's also a little bit excited about owning a professional basketball team.
[410] Hoopers, hoopers, hoopers.
[411] That's next time on Freakonomics Radio.
[412] Freakonomics Radio is produced by WNYC Studios and Dubner Productions.
[413] This episode was produced by Christopher Werth.
[414] Our staff also includes Shelley Lewis, Merritt Jacob, Greg Rosalsky, Stephanie Tam, Eliza Lambert, Alison Hockenberry, Emma Morgenstern, Harry Huggins, and Brian Gutierrez.
[415] We had engineering help this week from Matt Fiddler and Rick Kwan.
[416] Thanks also to our anonymous survey panel, you know who you are.
[417] You can subscribe to Freakonomics Radio on Apple Podcasts.
[418] You should also check out our archive at Freakonomics.com, where you can stream or download every episode we've ever made.
[419] You can also read the transcripts and look up the underlying research.
[420] Finally, you can find us on Twitter, Facebook, or even via email at radio@freakonomics.com.