Mike Norton: Allison, I'm wondering if you've heard of "pseudo-profound bullshit"?
Allison Schrager: I have not.
Mike Norton: Can I give you some examples?
Allison Schrager: Please.
Mike Norton: Here's two that I actually use every day. Every morning when I wake up, I say these 100 times each in my head to start my day. The first one is, "Wholeness quiets infinite phenomena." The other is, "Attention and inattention are the mechanics of manifestation." What do you make of those?
Allison Schrager: Um, they sound really profound. And I might start saying them, but I'm not quite sure what they mean.
Mike Norton: That's exactly how I feel about them, too. It's things that sound really important, life-changing. Then you think for two more seconds and you realize, "Wait, that doesn't mean anything at all."
Allison Schrager: You know, I see this all the time in finance. You, you have these gurus who are like, "I have the secrets of how to beat the stock market." Or, my favorite is, "I've figured out how the economy works." And they throw around these really big words, often statistical terms, incorrectly. But all they're saying is just like, nonsense. Like, something no better than that old expression, "Sell in May and go away," which is also nonsense.
Mike Norton: What I find so interesting is that our tendencies to believe things, sometimes even just because they rhyme, can affect us in really consequential domains in our lives. It's not just little decisions that we make, but really big decisions that we make, like, who we vote for in elections, who we choose for our partner, and even how we manage our money. The big question we should all be asking these days is, "Why are we so prone to believing bad information?" And, "What can we do to correct course?"
Mike Norton: I'm Mike Norton, and I'm the host of Talking Green. I'm also a social psychologist at Harvard Business School and I study the way people behave, and misbehave. On Talking Green, we explore how psychological forces drive attitudes and decisions around money and investing. This episode, like every episode, I'm joined by Allison Schrager, an economist, journalist, and culture maven. She's the author of the recent book, An Economist Walks Into a Brothel.
Allison Schrager: Today we're going to be looking at news, culture, and social media. And all the ways our brain is set up for us to believe misinformation and fake news. In today's climate, even the most rational of us are susceptible to false information. And it can affect our lives in many ways, even how we think about money.
Mike Norton: This is Talking Green.
Allison Schrager: An original podcast from OroTrade and T Brand Studio at The New York Times.
Mike Norton: So this episode we're going to talk about something that I am fascinated, and repelled, and intrigued by, which is the prevalence of fake news and, even more broadly, fake information. We'll talk a bit about why people seem so prone to believe in fake news. But first, do you have any favorite examples, or aspects of fake news that drive you crazy, in the way that they drive me crazy?
Allison Schrager: What I see a lot is someone's opinion being presented as fact. And I think this is one of the reasons it's all so seductive.
Mike Norton: Hmm. Do you think that we have this in us, and social media is just the vehicle through which these kinds of insane things get propagated? Or is the medium itself encouraging us to take these extreme opinions, and say wrong things, and then have them spread so widely?
Allison Schrager: You know, I'm noticing I'm becoming less open minded, and more reactionary to bad news. [laughter]
Mike Norton: So it's a propagation of anger. It does seem that when fake news spreads, it's often outrage. There's something in it that's making people so angry that they just throw it out, like, to everyone. And that's partly what drives this frenzy.
Allison Schrager: The anger feels good.
Mike Norton: Hmm.
Allison Schrager: It feels really good. Like, you see it. You realize everyone's wrong except for this like small group on Twitter who think like you. [laughter] And you know, you, you feel like you have this community that understands you while the rest of the world has just gone crazy.
Mike Norton: And then later you realize those are all bots and it's just you shouting at a bunch of computers, but it feels good still, so,
Allison Schrager: It feels so good.
Mike Norton: Yeah.
Allison Schrager: I have my pension Twitter community, which is, like how I identify with all these groups I, I follow, I feel like I'm part of the pension Twitter community, which is like 15 people. And we get really incensed about pension discount rates and actuaries and how crazy actuaries are.
Mike Norton: I mean who doesn't, [laughter]
Allison Schrager: Yeah.
Mike Norton: In one sense, this intersection of groups could be fantastic, right? There's a lot of research on how, you know, you can improve decision making by getting different perspectives on an issue, and things like that. And yet it often seems like when we crowd a bunch of people into these domains, we end up getting crazy decisions coming out of it. I'm thinking, of course, about IPOs. It seems like there's something that happens around some IPOs where all of these people crowding in with their opinions, it can kind of explode into something that, maybe it's not fake, but it's certainly surprising that this can happen.
Allison Schrager: I think generally a lot of economic indicators and variables are actually a function of people's expectations.
Mike Norton: Yeah. I think there's this naive view that the news just sort of reports on what's happened in the world and of course, in many domains, and I think especially in financial domains, the news can drive what happens in the financial world. The idea that something is popular because it's being talked about, well then you talk about it more and then it becomes more popular. The thing itself hasn't changed. It's just the conversation is changing and there's so much more buzz around it that now we start to make different inferences about how valuable it might be.
Allison Schrager: Yeah. And to some degree then it all becomes self-fulfilling and that's why you get sometimes these crazy IPOs where either a really good company crashes and burns, or a company that has no business being, you know, will just take off only to crash and burn later.
Mike Norton: So we definitely learned, Allison, that you're living in a weird echo chamber on social media. But it seems like all of us, in one way or another, are living in a weird echo chamber now, where bad information and overhyped information gets really, really amplified and not just goes viral, but also influences our judgment and choices that we make.
Allison Schrager: Yeah. Whether it's simply the decision to share that bad information with other people or sometimes, you know, we'll use this bad information to make a really poor investment decision.
Mike Norton: I wanted to learn more about the inner workings of why these things seem to influence us so much. And so I got a chance to sit down with two friends and colleagues of mine, Gordon Pennycook and David Rand, to talk about what's happening. Gord and Dave are both behavioral scientists. Gord's at the University of Regina in Canada. And Dave's here in Cambridge at MIT. And they recently coauthored a bunch of papers on the psychology of fake news.
Mike Norton: Welcome to the show, Gord and Dave.
Gordon Pennycook: Thank you.
David Rand: Great to be here.
Mike Norton: We're going to talk today about fake news, in general, and what underlies why we fall for fake news, and all the different cognitive biases that influence us. But Gord, I wanted to start with you, because a few years ago you kind of started this idea of studying, "Why do people fall for such ridiculous things sometimes?" And you studied a phenomenon that you called, in one word,
Gordon Pennycook: "Bullshit"
Mike Norton: You said it first, so now I can say it, I think, technically speaking. We kind of know what bullshit is. It's like when people are saying nonsense and things like that. But it seems like you had something different in mind when you thought about, "What are the consequences of it?"
Gordon Pennycook: I was just hanging out and looking at things online. And I saw this website, the New Age Bullshit Generator, which takes a bunch of these kind of buzzwords like consciousness and uh, quantum and intentionality and whatever else. And it just puts them together randomly in a sentence. And I came across this really awesome example of what we refer to as basically, "pseudo-profound bullshit." So it's these kind of meaningless vague sentences that sound like they might be profound. And you know, because I'm a psychologist, I get to just see if people will actually think that those things are profound. And that's, you know, what we did in the study.
Mike Norton: What were some of the ones that got you thinking this is complete, sounds completely profound and yet is completely meaningless?
Gordon Pennycook: So one example is, "Hidden meaning transforms unparalleled abstract beauty." Uh, that's just, that sounds pretty good actually. If you, you know, if you just don't think about it.
Mike Norton: That one just changed my life.
Gordon Pennycook: It was just constructed literally without any concern for the truth, which is the definition of bullshit put forth by Harry Frankfurt.
Mike Norton: I have that as a, as a tattoo down my arm. Are you, [laughter] are you saying that I've made a huge mistake?
Gordon Pennycook: You've made a huge mistake.
Mike Norton: Okay.
Gordon Pennycook: Well, I mean it depends. It depends on how, like how floral is the tattoo?
Mike Norton: It's pretty involved. There's like a cow head in it, too. It doesn't matter. Anyway, carry on with what you were studying.
Gordon Pennycook: Yeah, yeah, so basically, we just give people these random sentences, and we ask them, "Do you think this is profound? Does it sound profound to you?" And people who think that they're more profound are less analytic, and more likely to, you know, believe in conspiracy theories and things of that nature.
Mike Norton: And there's sort of a general constellation of, there's some sort of underlying, "I kind of believe stuff that maybe isn't totally true" construct underneath it?
Gordon Pennycook: Yeah. What Dave and I have been talking about more recently is what we're kind of calling reflexive open-mindedness, which is that some people are just way too open, or willing to believe things that they come across in their everyday lives. They're intuitively kind of unskeptical. And so that leads people to kind of believe this kind of pseudo-profound bullshit, but also things that we'll talk about later, like fake news.
Mike Norton: What have you learned about the psychology of fake news? What makes us more likely? What can we do to combat it? Are there things that you've learned?
David Rand: The first observation that we made was that there really is a link between finding these tweets profound and believing false headlines. I should say, the way we do all of these studies is we show people content as it would appear on social media. And we use all actual stories. So some of them are false, and some of them are true, but we're not making up our own fake news.
Mike Norton: And are, are there attributes that we can sort of identify that make things go more or less out there?
David Rand: Novelty or surprisingness is a major driver of what makes things get shared. And that if you see it and you're just like, "Oh yeah, I knew that already," then no point in sharing it. But if you're like, "Wow. That's crazy. Like, I never would've thought that," that makes you more inclined to share it. And I think that that's super important because one of the core findings that Gord and I have been repeatedly observing is people that think more, and people that are sort of more careful, less likely to just go with their intuition, more likely to stop and, sort of engage in analytical thinking and critical thinking, and being like, "Does this make sense or not?" they're less likely to believe both blatantly false stories and also just kind of hyper-partisan misleading coverage of events that actually happened, regardless of whether they align with their ideology or not. Just like across the board, the people that engage in more thinking are less likely to believe the false content.
Mike Norton: So one factor that kind of influences whether people believe this, is this idea of repetition, that sort of repetition, in and of itself, seems to produce more belief. Gord, can you kind of tell us about that research?
Gordon Pennycook: Repetition increases people's judgments of whether something is true because it makes it seem familiar. And we use familiarity as a kind of proxy for whether something is true. Like, "Oh, I've heard that before, so maybe it's true." And the basic cognitive psychology is that the more that something is repeated, the more easily it's processed in your brain. So there was this awesome paper from 1977, and the phenomenon ultimately came to be referred to as the Illusory Truth Effect. In the original studies, what they would do is take these trivia statements that people had to essentially guess on, that they didn't know the answer to. And they would just repeat them over sessions. And the more that something was repeated, the more that people thought they kind of knew the answer. What we did in our study was ask, you know, "Is this going to be true even for these egregiously false things?" The whole thing with fake news is that people are seeing these things on social media. But if you stop and think about them, they're relatively easy to identify as being false. So do we still get this effect of repetition? And we do. Even a single prior exposure to a fake news headline is sufficient to later increase people's perceptions of its accuracy. And that's even if the headline is inconsistent with their political ideology, which, you know, to us was pretty surprising.
Mike Norton: That's interesting, because I think one view of what's happening with fake news is that it's because things are hyper-partisan: on one side, we believe the fake news on our side, and on the other side, they believe the fake news on their side, kind of thing. But you guys are showing that what matters more is literally just, "Are we seeing this information a lot or not?" That seems to play a big role in whether we believe it's true, even when it doesn't agree exactly with our partisanship.
David Rand: Right.
Mike Norton: There's this kind of sense that, "I have a feeling that I heard this somewhere," is easy for us. And then everything else is hard for us. Right?
Gordon Pennycook: Yeah.
Mike Norton: How many people did I hear it from? Was it credible or not? Was it labeled or not? We seem to just lose those pretty quickly. And then we're left with just this nebulous information that influences what we believe.
David Rand: A political scientist at MIT, and some coauthors, have cool work in that vein showing that when you correct misinformation, the memory for the correction fades faster than the memory for the original thing. So right off the bat, the correction looks kind of effective, but then you come back a week later and people have forgotten it.
Mike Norton: I'm thinking of a study that I love by Lisa Fazio, where she convinces people that what Scottish people wear, it's not kilts, it's saris. And at the beginning of the experiment, everyone knows that it's kilts. Right? Like, if we know anything about Scottish culture, it's that Scottish people wear kilts. I think bagpipes would be the other one. It's kilts and bagpipes. And she's able to, through just repetition, get people to start second-guessing something that they knew was true. So they're not even coming from a place of, "I'm not sure what's true." Even on these kind of fundamental facts, why is it that our minds, with just people prodding us a bit, start to believe all sorts of things that really we knew five minutes ago weren't true?
Gordon Pennycook: We would be constantly overwhelmed by our environment if we weren't filtering things. And so a lot of it is built on efficiency. And if you think about it that way, some of these results start making sense. So like, people knew at some level that a kilt is what you call the skirt that Scottish people wear. But in the context of this other effect, where you can repeat something and it gives it a little sense of plausibility, now it's just kind of backfiring, because unfortunately the world's too complicated for our meager little brains.
David Rand: But when you look at the overall pattern of results in all the studies that we've run, when you ask people to judge the accuracy of headlines, or when you ask people to say how much they trust different news sources, and we've got another study on that, people actually do, like, quite well. The true stories get a much higher truth rating than the false stories. And the mainstream outlets get a much higher rating than the fake news or hyper-partisan outlets, even if those align with people's ideology. And I feel like one thing that has come out of our work is that if you ask people to stop and think about accuracy, they're actually, like, pretty good.
Mike Norton: Um-hmm.
David Rand: And I think that a lot of the problem, particularly in terms of social media and stuff getting shared on social media, is like, not that people are, you know, hopelessly blind and, you know, held in a little box by their partisanship and so on. But more that people just aren't really thinking about it. And so we've been doing new experiments now essentially nudging people to think about accuracy while they're on social media. And we find that it does make people, you know, substantially more discerning in their sharing. Taken together, the bulk of our results is like, a little bit more optimistic.
Mike Norton: But I'm thinking, like, in the financial space, if you think about people trying to predict things about the stock market, or something like that, yeah, we can have reasons for spreading misinformation. But also we seem just as susceptible to believing the crazy thing, and wanting to pass that along. Is it just that, across domains, humans seem to struggle a little bit?
David Rand: So I think that the, the financial case is an interesting one because I think the problem is harder. In the, in the same way, in the way that I was saying, like for the kinds of stuff that becomes popular on social media, that is selecting for crazy stuff that you can figure out probably isn't true if you think about it. My guess is that the signal is harder to pick out in the financial context.
Mike Norton: IPOs are the best example where suddenly something is worth 9 trillion dollars and everyone's incredibly excited about it. And obviously sometimes that information is true, but at least some of the time it feels like it's more buzz and hype, than accurate, and we should be kind of bringing our decision-making and our careful thinking to the task and really trying to understand, "What is this thing really worth?" And instead we often get swept away and we're completely lacking clear thinking. You'd think when it comes to financial decisions like investing, we'd be really rigorous in how we think about them. Why do you guys think, instead, we get so wrapped up in these things?
Gordon Pennycook: So let's think about what we talked about in terms of fake news. In fact, some of the evidence that we have in the context of that repetition effect is that how smart you are doesn't really matter that much, you know. Basically everybody shows that effect. It's kind of ubiquitous. And so in the context of these, like, important financial decisions where you should really care about accuracy, that doesn't mean that you will only care about accuracy. There are going to be other things that are gonna influence you. And it's gonna, of course, depend on the context: motivation, wanting it to be true, you know, and thinking about only the positive things. All these biases are still evident. It doesn't mean that we're hopeless, but there are still going to be these heuristics that we rely on, because, you know, we just don't have infinite processing capacity in our brains, right?
Mike Norton: And then we have like, financial gurus, who have TV shows, and they are often over time shown to be wrong again and again and again. Why don't we start to think, "Maybe the fact that these people are saying these confident, profound things, maybe they actually don't know what they're talking about."
Gordon Pennycook: We want answers, you know. That's it. It's way too difficult to truly take into account somebody's quality of expertise. What you do instead is take into account the quality of what they're saying right now. I think our intuitive sense that eventually people get figured out is probably wrong, in that people are much better at faking it than we are at recognizing that people are faking it.
David Rand: People that like simple answers find gurus attractive because the guru gives you the simple answer and then you don't have to worry about it. And even if it turns out to be wrong, whatever. You didn't have to worry about it. Fear gets people to not think carefully and to sort of just go with their default, shut down, be defensive, whatever. And then when you think about things in the, in the financial context, similarly, like when there's like a market crash, you know, when things aren't going well, people get really scared and nervous and that makes them do things that perhaps they would not do if they were being a little bit more reflective about what's going on. People don't think through that because they're panicked.
Mike Norton: Yeah. And I think if you look at financial behavior in general, it's on the other side, too, where when the market's doing well, the excitement from that can completely mess up our decision making as well. So we tend to overreact to things in the environment when they happen, instead of trying to think, "Well, what about later?" Because later never feels as salient as right now. So these emotions that you're talking about can really change our behavior, in ways that later on we can't believe we did that, because we're not that person anymore.
David Rand: Totally.
Gordon Pennycook: One way to think about it is that, like, how much thinking is required to be reasonable? The more emotional valence there is, the more reasoning you need to do to be reasonable.
Mike Norton: And we're kind of just inundated now with real and fake news and Twitter and whatever else it might be. I'd like to hear from both of you: for me, today, what should I do to try to be a little bit better about not getting down the fake path, and staying more toward the accurate path?
Gordon Pennycook: The number one thing is to kind of recognize the weakness of your own brain.
Mike Norton: Well, my brain is mighty, but, so let's talk about an average person.
Gordon Pennycook: Yeah, exactly. But I mean, if you believe something that nobody else believes, you have to kind of think that you're smarter than everybody else, right? You're like, "Everyone else is stupid. I figured this out. Nobody else did." And so overconfidence is a major problem. But recognize the kind of weaknesses of your brain, and that, you know, social media is not really conducive to thinking critically unless you make it that way. Unless you kind of stop yourself and, you know, engage with the medium a bit more reflectively, you're going to fall prey to the same heuristics that everybody else does.
David Rand: You know, if you think about social media, it's sort of, by design, a space where it is hard to think analytically and critically about content. The news is intermixed with pictures of babies and cats doing crazy things, and whatever. This is like, you know, emotionally salient content.
Mike Norton: Did you mean babies doing crazy things? Or just babies by themselves? Or babies and cats together doing crazy things?
David Rand: It's all out there, Mike.
Mike Norton: It's all out there.
Gordon Pennycook: Yeah. In all, all combinations.
David Rand: Whatever you want, it's out there. Um, what I see as the take-home from our research is that, if people bother to think about it, they can be pretty good. It's just that, in general, we are not bothering to think about it.
Gordon Pennycook: "Hidden meaning transforms unparalleled abstract beauty."
Mike Norton: To me, that's it. I mean, that's how to solve fake news entirely. I want to thank both Gord and Dave [music starts] for joining us today. Thanks guys.
David Rand: Thanks so much, this was great.
Gordon Pennycook: Thanks man.
Allison Schrager: Mike, so my first thought is there's reasons to feel really pessimistic. I mean, after all, there are so many ways our brains are set up to be unreliable and so easily tricked, but it actually sounds like Gord and Dave are pretty optimistic.
Mike Norton: I had the same experience. At first I was deeply, deeply worried. And then by the end of chatting with them, I felt a little bit better, and also a little bit more empowered about what I can do to try to be a little bit better.
Allison Schrager: So what can you do?
Mike Norton: The first thing that struck me was this idea of being really sensitive to repetition and how it can lead us really far astray. Even really, really smart people, when we hear things over and over again, we tend to start believing them. They feel true just because we've heard them a lot. The good news is that their research shows that we're able to identify statements that are true and false, if we just take a little bit more time and say to ourselves, "Does this sound right?" instead of just saying, "Well, it's familiar. It must be right." It's especially critical when we're making really big decisions, like investment decisions, where the rule should actually be, "The more familiar something sounds to us, the more skeptical we should be."
Allison Schrager: Yeah, and this is true when we invest in anything, but especially an IPO, where often all you have is a lot of hype and not a lot of data, because that stock has not been traded before. All you get is the hype that you're gonna be in on something new, but you don't really know how that asset's gonna perform on the market yet.
Mike Norton: That kind of reminds me of the dating market, actually, where there's basically mainly hype, and very little information, and you're still trying to make good decisions. [laughter] My second takeaway was this great phrase that Gord and Dave used, "reflexive open-mindedness." It's human nature to be open and curious, but we can't let our open-mindedness and curiosity cause us to doubt common sense or basic facts. This is like the cautionary tale of the kilt and the sari. Even the smartest among us can believe things that aren't true when we're being too open-minded.
Allison Schrager: Yeah, and this is especially true when we're making financial decisions. There's just so much information being thrown at us. And more and more, the skill is knowing what information is good and useful, and what is not. You get a lot of contradictory advice. Or you get a lot of data points thrown at you that don't really make sense. So you want to be really careful to be very critical about who your sources are, and make sure you can trust where the information is coming from.
Mike Norton: I think the third really broad thing that struck me is that we're not bad people because we fall victim to these things. It's that our brains are a little bit wired to fall victim to these things. A lot of the time familiar things are sort of true and okay, so we believe them, but sometimes they're not and we have to watch out for them.
Allison Schrager: Yeah, and I think especially when it comes to finance there's a lot of overconfidence. Everyone wants to believe that they're the ones who know what's accurate. So when it comes to making a financial decision, we have to humble ourselves and accept our limitations.
Mike Norton: And then finally, and clearly most importantly, the biggest takeaway for me was, and I'm actually going to have T-shirts printed with this on it, and we should all start living our lives by this: "Hidden meaning transforms unparalleled abstract beauty."
Allison Schrager: I would definitely take that t-shirt. [laughter]
Allison Schrager: Talking Green is an original podcast from OroTrade and T Brand Studio at The New York Times. Learn more about fake news and your finances at nytimes.com/talkinggreen.
Mike Norton: And subscribe to Talking Green now so you don't miss a single episode. Join me next time as I explore the connection between making good financial decisions and foraging for mushrooms.
CP: It's really hard to find them. But it is incredibly satisfying when you do find one and I've been doing it for about three years now, and I can tell you basically every single spot I've found one of those mushrooms because, when I did find a mushroom, my brain released dopamine, saying, "Ooh that's exciting! Worth remembering and doing that again."
Mike Norton: I'm Mike Norton.
Allison Schrager: And I'm Allison Schrager.
Mike Norton: Thanks for listening.