The first part of this episode is available to all listeners. To hear the whole thing, become a paid subscriber here.
Cognitive psychologist Gordon Pennycook explains the psychological reasons we fall for misinformation, conspiracy theories, and general bullshit (a technical term!). We discuss why people with an analytical cognitive style tend to be more skeptical of alternative medicine and health misinformation, some of the pitfalls of intuitive thinking (and why intuitive *eating* may actually be more of an analytical or deliberative process), why being skeptical of out-there wellness practices is actually a sign of open-mindedness, why even very smart people can fall for wellness misinformation, and more. Behind the paywall, we get into the difficulty of trusting experts in matters of health and wellness, the importance of thinking critically about science, the attention economy and how it contributes to incentivizing misinformation, how conspiracy theories have touched Gordon's life, his surprising findings about what it takes for people to drop conspiracist beliefs, and the best ways to stop the spread of misinformation.
Gordon Pennycook is a Himan Brown Faculty Fellow and Associate Professor in the Department of Psychology at Cornell University. He obtained his PhD in Cognitive Psychology at the University of Waterloo in 2016 and held a Social Sciences and Humanities Research Council of Canada Banting Postdoctoral Fellowship at Yale University.
His expertise is human reasoning and decision-making, and he has published over 100 peer-reviewed articles, including in journals such as Nature and Science. He has published research on the spread of fake news and misinformation, as well as the first ever paper on the psychology of bullshit.
Gordon has received several awards, such as the Governor General’s Gold Medal, Poynter Institute’s International Fact-Checking Network “Researcher of the Year,” and early career awards from the Canadian Society for Brain, Behaviour, and Cognitive Science, the Psychonomic Society, and the Association for Psychological Science. He was elected to the Royal Society of Canada’s College of New Scholars, Artists, and Scientists in 2020.
Resources and References
Christy’s second book, The Wellness Trap: Break Free from Diet Culture, Disinformation, and Dubious Diagnoses and Find Your True Well-Being
Subscribe on Substack for extended interviews and more
Gordon’s website
Christy’s online course, Intuitive Eating Fundamentals
Transcript
Disclaimer: The transcription below was primarily rendered by AI, so errors may have occurred. The original audio file is available above.
Christy Harrison: So, Gordon, welcome to Rethinking Wellness. I'm so excited to talk with you.
Gordon Pennycook: It's my pleasure.
Christy Harrison: I'm really fascinated by your work in general. You have so many different findings on the link between cognitive styles and conspiracism, bullshit receptivity, misinformation. All of this is really fascinating to me, so I'd love to dive into that. But before we get into it, can you just tell us a bit about your background and how you came to do the work you do?
Gordon Pennycook: By training I'm a cognitive psychologist, an experimental psychologist. So I started at the University of Saskatchewan in Canada. I'm from Saskatchewan, which is the middle of the country, and I became really fascinated by why people believe what they believe. And so the first research I ever did was on just the nature of human reasoning. And that's kind of what my focus has been ever since. I did grad school at Waterloo in Ontario, and then I did a postdoc at Yale University, went back to Saskatchewan, and now I'm at Cornell University.
And all along I've been doing work on just basically the nature of human thought. What is it that leads some people to be really skeptical about belief in God, for example, or why do some people fall for misinformation while other people are really strong believers? And that seems to kind of pull you down rabbit holes, as it were.
Christy Harrison: Do you have any background yourself being pulled down those rabbit holes or believing in misinformation at other times in your life? Or is it more what you've observed from other people?
Gordon Pennycook: Well, I've definitely believed in misinformation. I'm a hockey fan, and I used to be really superstitious about where I would sit and things like that. I think we all kind of have these stories, but the thing that really led me into being fascinated about belief is that I'm from a very small town called Carrot River, Saskatchewan, about a thousand people. And at some point, I just kind of stopped believing in God. I remember actually distinctly learning about evolution. I had a discussion with my mother. I said I was excited about these ideas.
And she said, "Well, you still believe in God, though, right?" And I was like, yeah. And then later on, I was asked why. To me, it seemed like a non-sequitur, but then I kind of thought about it, and then everything kind of unraveled from there. But there wasn't really anybody else around who apparently didn't believe in these things. And I just thought that was really weird and interesting, and I didn't know if I was weird or what that was about. Most psychologists will tell you this, that they end up going into psychology to try to understand themselves a bit better, but that's really what led me down this path.
Christy Harrison: Well, I'd love to talk a little bit about that research. So it sounds like maybe you are a more analytical type of person, perhaps in general. But can you talk a little bit about what your findings on cognitive styles have been and the difference between analytical and intuitive thinkers, especially?
Gordon Pennycook: Yeah, there's this really important distinction. So if I ask you what your name is or what two plus two is, you can answer without having to think. Essentially, it just pops into your head. But that happens also for more complex things. We have emotions that come from things like seeing a fake news headline on social media, and these things might influence the way that you feel about something, and that's kind of like an intuition. But we can also stop and reflect about things. We can decide, in many cases, to think about something.
So a simple example: if I ask you what's 37 times 82, you have to kind of decide whether you want to solve that problem or not. That's up to you, and so that's different. We can use that deliberation to understand, or even in some cases override, our intuitions to change our minds. And that's really kind of the core focus that I've had for a long time, which is how do we actually change our minds? How do we override?
Intuitions are often very accurate. I mean, if I ask you what your name is, you can answer that question. But sometimes they do lead us astray. Sometimes what we feel is actually not the right direction. And we have to be able to stop and use our more rational faculties to give better answers. And that's one thing that varies a lot between people.
Christy Harrison: Yeah, well, it's so interesting. So there's a lot there to unpack. Just as my background: I'm an intuitive eating dietitian. I'm a registered dietitian. My original background is in journalism. And I always wanted to be a journalist and writer and have been doing that for 21 years. But along the way, I also picked up this degree in dietetics. And originally it was driven by my own disordered eating and disordered thinking about food. Kind of like you being drawn to know more about yourself and psychology, I was drawn to dietetics to try to know more about: how can I control this out-of-control feeling I have with food? How can I become a, quote unquote, healthier eater?
And along the way, I really discovered so much about disordered eating and the roots of that, and how dieting and restriction can actually drive people to more disorder and feeling out of control with food and all of that. So it led me to a practice called intuitive eating, which is probably different from the definition of intuition you're using, but it's more about getting back in touch with your body's cues, learning to feel hunger and fullness and satisfaction, trusting your choices, not second-guessing yourself all the time or going down these diet rabbit holes, which are really so full of misinformation a lot of the time, and just eating in a more balanced way and being at peace with food and not having it take up a huge amount of space in your life. So that's kind of my background.
But as I started to practice intuitive eating personally in my own recovery from disordered eating, I started to think about the role of intuition in other areas of life. And I think I'm someone who does tend to overanalyze and overthink a lot. And so I started to try to let myself be more intuitive and just feel my feelings a little more in other areas. And it was productive, I think, in the sense of recognizing when a relationship was not good for me or when I had certain feelings in a workplace and I realized that it wasn't a good fit.
Being mindful of intuition and especially feelings coming from my body and sort of anxieties that were showing up in different ways, that was helpful in making decisions. But I think what you're talking about with analytical thinking might encompass some of that, right? Of being deliberate about your choices and thinking about what's going on for you in your environment, and not necessarily just going with the first thing that pops into your head.
Gordon Pennycook: Being deliberative doesn't mean that you're not intuitive. What it means is that you're maybe being deliberative about how you're being intuitive, what information you're gaining from paying attention to the signals from your body. That's no different fundamentally from gathering more information from the world. And so the message of this kind of research is not ignore your intuitions, but understand them and think about them. And don't just rely blindly on them, but try to make sure that you are actually being mindful about how you're making your decisions. And in many cases, thinking about where they're coming from, in which case, you have to be in tune with your intuitions to do that accurately.
Christy Harrison: Can you say more about that? Because in the research, it seems like there are two camps: the people who are the analytical thinkers and the people who are the intuitive thinkers. But in practice, I'm sure it's a real spectrum, right? And the people who are more on the analytical side are not totally without intuition, but they're channeling their intuition, or they're understanding their intuition, in a different way.
Gordon Pennycook: It's definitely a continuum. We actually don't know if some people are, strictly speaking, more intuitive than others, in the sense that it would have to mean you have stronger intuitions about something. That always depends on what the context is. If I play a lot of chess, I've got really strong intuitions about the next move. That's not because I'm more intuitive, it's because I know about chess. So it's not about me, it's about the thing that I'm doing. So really, most of the variation comes in how much you are willing to expend effort thinking about something, essentially.
And people who are more deliberative still have intuitions and feelings and emotions, and they have thoughts about things that come to mind. In fact, those are kind of the building blocks for our deliberation. The way that we define intuition is something that basically pops into your head automatically. We need those things; you need them to interpret the world, essentially. But in order to make accurate choices, in some cases, you need to know when to stop and think. You have to know: is this a case where I need to think about what's going on here?
So ultimately, it comes down essentially to overconfidence: knowing when you know something, knowing when you don't know something, and knowing when you need to stop and think. And so what we talk about is metacognition, thinking about your own thinking. People would talk about this in terms of words like mindfulness. A lot of the research on that is not that great, but it has more to do with just thinking about the way that you're thinking, being aware of: is this a good decision? Am I doing this based on good reasons? And that sounds like what you're talking about in terms of this intuitive eating, which is being aware of the signals that your body is giving you. Those are important things to pay attention to.
Christy Harrison: Yeah, that's really interesting. And also kind of like learning when to turn off the overthinking and when it's sort of doing you more harm than good to be obsessive about everything you're eating and tracking all your calories and all that stuff.
Gordon Pennycook: That's thinking about how you're thinking also. "Am I overthinking?" That's taking one step back and asking, how am I actually thinking about this? It's thinking about your own cognition. So that's really the kind of underlying important element.
Christy Harrison: That's really fascinating, because in the research it seems like there are a lot of associations between an analytical cognitive style and skepticism, and I'd love to talk a little bit about that. It sounds like analytical thinking is also about deciding to value different things, or deciding when to go deeper and when to sort of go with your first instinct. Maybe there's something around values there, too, of what you think is important when.
Gordon Pennycook: The things that you think are important are going to be the things that you are more motivated to spend your time getting right and thinking about. But there are countervailing cases. And so there's another element of thinking styles that we measure called actively open-minded thinking. So it's not just how much time you are spending deliberating about something; it's are you actually questioning your prior beliefs, questioning yourself? Are you using new evidence to update what you believe to be true?
In many cases, the things that you care about the most are going to be those things that you are less willing to really question. And so that's where you see some of the biggest differences: people who are more actively open-minded are much less likely to believe in superstitions, for example, or to engage in remedies that are not based in evidence and science and so on. So that's a pretty consequential distinction. It's not just about how much you think, but really how you think.
Christy Harrison: That's so interesting, because open-mindedness in sort of lay parlance is sometimes used differently. I am very scientifically skeptical, and I spend a lot of time unpacking and refuting misinformation around health and nutrition. I'm very much on the side of evidence-based and science-based medicine, and often sort of criticizing various elements of alternative medicine. And one thing that I and other science communicators in this space have often been accused of is not being open-minded: not being open-minded to alternative remedies or the sort of mystical realm or something like that. What would you say to that based on your research?
Gordon Pennycook: I get the same thing, because I try to bring evidence to all things that bear on my life, including with my relatives and stuff like that. Sometimes it's more fun to just be like, "Yeah, right on. That sounds like a cool idea. Let's just leave it at that." But you don't want to be so open-minded that your brain falls out, as the classic adage goes. So the problem is that people conflate open-mindedness with acceptance. And if you are looking at some sort of claim and then really evaluating it and looking for evidence, that's more open-minded than just accepting it blindly. You're actually trying to understand what the thing is, whether it works, and whether it's legitimate. That, to me, is really what it means to be open-minded.
Often people think about it in the context of just being willing to accept. But that's not really open-mindedness. That's actually a form of closed-mindedness. You're allowing yourself to be pushed around by whatever it is that people are telling you, or whatever is going on in the world, and not filtering that information to make better choices. And so that's why we call it active open-mindedness. It's really about the process of trying to get it right.
Christy Harrison: Yeah, that makes a lot of sense. It's different from the sort of traditional conception of open-mindedness, but I can see how that traditional conception is just more about not questioning your priors, really, and going along with things, or maybe even sometimes motivated reasoning.
Gordon Pennycook: And I think part of it is that when you say, "I don't think that this thing that you think is true is true," it's an easy thing to say, "It's because you're not open-minded," meaning that you haven't really thought about it. But if you can then say, "But I actually did a bunch of this research, and I have thought deeply about it, and these are the reasons why I don't think it's true," that doesn't really square with this idea that you're not being open-minded about it. Of course, the person will just say you didn't really look at it the right way, and then they can leave the conversation. But that's just an easy out, unfortunately. But that's communication.
Christy Harrison: So you and fellow researchers have found that people with an analytical cognitive style, which really sounds like a deliberative cognitive style in what you're explaining here, tend to be more skeptical of alternative medicine, which, again, is often less evidence-based and more pseudoscientific. And they're also more skeptical of COVID misinformation, including false claims about alternative "cures" and conspiracy theories about the disease. Can you talk a bit about these findings? I'm so curious how analytical thinking plays into this.
Gordon Pennycook: It guards you against the things that feel right but aren't, the things that appeal to certain elements of us. It could be some combination of prior beliefs or, in the case of conspiracies, some element of people's paranoia or distrust in the government or whatever sources of power happen to be causing the conspiracy to occur. So these things have an intuitive appeal; in the case of misinformation, they are in many cases explicitly constructed to appeal to people's intuitions. And so having the capacity to step back and to question them is really important. And this actually is even more important in certain contexts. So, for example, think about how social media operates. It's an attention economy where there are lots of flowing streams of information, and if you want to create misinformation, you need first to grab people's attention, to get them to stop scrolling and look at the thing that you're creating.
And so how are you going to do that? Well, you're going to make it something that makes people scared or draws on certain emotions. And then, if we're acting in a kind of automatic way, what we call a reflexive or intuitive way, you're just going to share that with other people. And that's how misinformation spreads. But people who are more reflective in how they engage with the world might stop for a second and say, "Wait, is that actually true?" We've actually done these experiments where we know that when it comes to sharing this information on social media, some people share things that are false that they would be able to recognize as being false if they thought about it.
So we have an experiment where, in one condition, people say what they're going to share: we give them a list of true and false news and ask, "Would you share this one? Would you share this one?" And in another condition, we ask them, "Hey, what do you think is true?" And people actually share more false things than are believed by the people who are rating the accuracy. So people are sharing things that they would be able to identify as being false, because they're not thinking about it. Because when you're sharing something on social media, you're thinking: is this important? How does it make me look? Are people going to like this? Whatever. And you're not really asking, is this true? So you have to kind of stop and do that extra step.
That's true for lots of things in the way that we engage with the world: we have to actually take the extra step. And a lot of people don't. I'm certainly not reflective all the time, but we have to kind of be vigilant about these things if we want to make sure that we're engaging with the world in a more accurate or truth-based way.
Christy Harrison: You raised the point that there's just this constant stream of information, and we're also busy all the time. People are checking their phones and their social media in between things, on a quick break or something like that. They're maybe not going to devote a lot of time to it, or dive deep into a fact-finding mission, or sift through the facts to make sure it's not misinformation. They're just like, "Oop, got to share this. This seems important. I'm going to spread the word about this" or whatever. And then there's also the attention economy and the way that bad actors will harness that, or people who maybe aren't even necessarily ill-intentioned but just believe what they're selling, believe their own bullshit or whatever, and are selling misinformation and spreading it around and creating communities and economies around it.
I think about the anti-vax economy, for example. And I think in that sense, there's even a heightened feeling of "this is going to save people. I have to warn people. This is important information that I have to get out there." There's sort of a vigilante kind of feel to it, almost.
Gordon Pennycook: Totally. And one thing I want to make clear about this is that it's not about blaming people. In fact, in a certain sense, it helps us understand more about the behavior. This is something that we all do in many cases. Like we've already said, our intuitions are highly accurate, and so it makes total sense that we're going to listen to them. But sometimes we have to stop and question them, too. When we hear people discussing someone who's maybe spreading a lot of misinformation, they might think that person lacks intelligence or whatever, but in a lot of ways, it's not really about that. In many cases, it's more that they're doing things that we all do: relying on their intuitions.
It's not that people are being malicious necessarily when they're sharing misinformation. Sometimes they're just falling prey to the same things that we all do, which is that we had a long day, we go on Facebook for entertainment, we see this thing that seems really important, and we share it without thinking that much about it. That's a very human thing to do. So it's not really about casting aspersions, it's about trying to understand how these things happen so that we can try to be better when we're engaging online.
Christy Harrison: Yeah, I really appreciate that framing, and that's a very compassionate way of viewing things. I try to approach things with a lot of compassion, too, because I have also been down these rabbit holes and believed things that were really untrue, and had my dabblings in pseudoscience and alternative medicine and all of that as well. And I think it's really interesting, because I consider myself a smart person. I've always done well in school and all that, and never thought that I had sort of a cognitive reason for believing in things that were untrue or whatever.
You had some findings showing that people who share the same cognitive ability can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically, which really seems to get at the heart of it. I think that's really interesting. I'm working on a piece about why smart people fall for wellness misinformation, and that strikes me as one of the possible reasons, maybe one of the central reasons. Can you talk a bit about that, about your findings around cognitive ability versus analytical or intuitive thinking style, and why it's more about thinking style when it comes to protecting people against misinformation?
Gordon Pennycook: So cognitive ability is basically your ability to understand complex things. An analogy would be: imagine my grandmother has a Lamborghini. That's the cognitive ability. It's got a big engine, it can go really fast. But if she had it, it wouldn't be going very fast, now would it? In this analogy, she's not going to drive very fast. The gas pedal is kind of your thinking style: to what extent are you taking advantage of the abilities, or using them? You could be extremely smart when it comes to memorizing things, you could have a strong working memory, or you could be really good at physics or math or whatever, but if you're not willing to stop and think about things, then you're not really going to make use of a lot of those cognitive abilities. That's kind of the lever in many cases.
This is why we think about it in terms of the importance of thinking styles. Someone who's extremely intelligent, but is not really thinking critically or questioning things, they're not really using their intelligence in such a way that it's going to help them really navigate the world based on evidence. It'll help them in some contexts when they know they need to think about something. But knowing when to think is actually the important kind of underlying capacity.
Christy Harrison: That's really interesting. And I'm struck, in my world of wellness and health, by the misinformation that tricks people online, that people buy into and that I once bought into as well. I think there's a sense in which people don't always use their cognitive ability toward that, even though health and wellbeing are so important and essential. And there are various reasons for that, too.
I think about my own experience and how I was dealing with a lot of stuff that was undiagnosed or misdiagnosed in the conventional healthcare system, and feeling dismissed a lot of the time, feeling unheard, feeling frustrated that I wasn't getting help or even a label for what was going on for me. And I think that led me to feel like, "Okay, well, let me just try this. Let me just look over here." Even though there were some red flags for the things I was trying, I was like, "Well, what do I have to lose?" I think that was one reason, maybe, that made me less analytical and less skeptical about those sorts of approaches.
Gordon Pennycook: That's right. We have these things that block us from really taking advantage of our abilities to reason, but those abilities are still there. It's not like we suddenly lose the faculty to think things through. It's just that sometimes it's going to be difficult, or we're not really prepared in that moment to really question something, and this happens. People are facing loss or whatever: totally justifiable scenarios where someone's not ready to spend the time to really question this. But we can eventually, if we have the capacity to. So it's just a matter of getting ourselves there and making that hard choice.
Christy Harrison: There's another study that seemed like it had some different or overlapping findings, where you and a team found that a person who's susceptible to online misinformation about one health topic may also be susceptible to many types of health misinformation, and also that those who were more susceptible had less education, less health literacy, less healthcare trust, and more positive attitudes toward alternative medicine. I think that's interesting. Especially the healthcare trust piece, which is sort of related to what I was just talking about. And then the education and health literacy piece.
At first glance, it might seem contradictory to the findings around analytical thinking, where it's not really about cognitive ability, but about this thinking style. But maybe it's not necessarily about cognitive ability, but education, which feels related in some way to cognitive ability. So can you just shed a little light on those findings?
Gordon Pennycook: So, knowledge and education are like outcomes that emerge from our underlying capacities to reason and our dispositions to think. Education will lead people to gain knowledge, and that increases if the person is more analytical, because they're thinking more about what they're engaging with, and that helps them learn it. So these things are interconnected when it comes to health literacy and all that kind of stuff.
One other element is what you might call curiosity, I guess: wanting to gain information. This is a little bit separate from the kinds of things I'm talking about, but also very important. Just wanting to know how the world works and being fascinated is something that people might call open-mindedness in a different sort of sense. Being curious and wanting to figure things out is another very important element of what leads people to gain information. Of course, ideally, you have that and also some critical faculties to sift through the good and bad stuff, obviously.
Christy Harrison: Because curiosity without that could lead to the dregs of the Internet in some ways.
Gordon Pennycook: Right, now you're down the rabbit hole.
Christy Harrison: Yeah, exactly. The other piece that's really interesting to me about that is the positive attitudes toward alternative medicine. Can you talk about what you found around alternative medicine and cognitive styles? Sort of the link between belief in alternative medicine and other forms of misinformation.
Gordon Pennycook: It's kind of the contrast with this analytic thinking thing. Another term that we sometimes use is reflexive open-mindedness, which is the thing we talked about: being so open-minded that your brain falls out. Again, as I said earlier, not to cast aspersions, but it's just a way to frame it, which basically means you're accepting the information that's coming in to you, whatever its quality. And that will lead to a certain kind of confluence of beliefs, meaning that in some cases they're almost even contradictory beliefs. There are some studies showing that people who believe conspiracies will sometimes believe conspiracies that completely contradict each other, like alternative explanations for the same phenomena, because they're willing to accept either of them in any particular scenario.
The same is true for alternative medicine. This is a case where there are so many different potential remedies for the same underlying issue that people will accept all of them as being effective, even though they might be inherently contradictory. It ultimately comes down to just blindly accepting things and believing in them, which is not a good strategy, particularly in a very complicated, intense information environment, and one where people are taking advantage of others by creating what is essentially snake oil to sell and profit from.
Christy Harrison: Totally. I'm not sure if you're familiar with some older research that shows postmodern values predict greater acceptance and use of alternative medicine. I found that recently, and I was really interested in it. There's one study from 1998 that found both people who held postmodern values and those who were dissatisfied with interactions with their doctors were more likely to have a positive attitude toward alternative medicine. But the postmodern values were a far bigger factor, actually, more than 19 times higher. Have you seen research to that effect, and what do you think of it? How does that square with what your research has found?
Gordon Pennycook: I'm actually not aware of that one. That sounds interesting. I want you to send that to me. It does sound a lot like basically the converse of actively open minded thinking, in the sense that the kind of postmodern worldview is that we all have our own truth, and therefore evidence is not relevant. It doesn't matter what the evidence is, because we define truth for ourselves, and therefore, that justifies any belief that you happen to have. Very convenient.
Whereas this actively open-minded thinking is literally just the opposite, which is questioning yourself. Am I actually right about this? What external evidence can I bring to bear to justify what I believe or to undermine what I believe? I think it ultimately comes down to overconfidence. Are you willing to say that this is the answer, and that you're confident in your ability to determine that? Will you proceed because you know the truth and other people don't? Even if all the scientists say it's not true, I think I know the truth. And that, to me, is ultimately about overconfidence.
Christy Harrison: The antidote to that would seem to be intellectual humility, or continuing to ask questions, not from that "question everything, everybody's out to get you" conspiracist place, but just trying to learn more from sources that would seem to hold greater access to the truth, maybe.
Gordon Pennycook: And not just that, but questioning yourself. The most important thing is: do I really know? And here's an interesting element of this, which is thinking about experts. In a standard workplace, everybody trusts experts, actually. If you're at work facing a complex problem, and there's a colleague of yours who knows how to do the thing, you ask that person. There's just no question about it. If we had a technical problem on this podcast or whatever, we'd probably have to get somebody in to help us solve it. We do this because we have a very fragmented kind of intellectual environment, where people have very specialized skills, and we need all of those things to work together in this complex environment.
But then it comes to health or science or climate change or whatever the thing is, and suddenly we don't listen to the experts there. Suddenly we say, "No, actually, I'm the one who knows the truth. And the people who dedicate their entire lives to trying to understand this thing, they don't really understand it, but I've done my research in 15 minutes here on Google, and I think I know." So we know people are selective about which experts they listen to, and this is a big problem. We have to understand what expertise really means: that you don't know everything, that other people know more than you, and that it makes sense to listen to them if they know more than you.
Christy Harrison: Why do you think it is that people have come to mistrust experts so much in matters of health and science and things like that?
Gordon Pennycook: So part of the problem comes down to what counts as an expert. That has been really complicated.