Rethinking Wellness
What You Don't Know and Why It Matters with Timothy Caulfield


The first part of this episode is available to all listeners. To hear the whole thing, become a paid subscriber here.

Health-misinformation researcher and science communicator Timothy Caulfield returns to discuss his new book THE CERTAINTY ILLUSION, why being too certain about anything makes us vulnerable to misinformation, how intellectual humility can help protect us, why science is sometimes “full of shit” and how to be a critical consumer of it, and more. Behind the paywall, we get into why it’s so hard for public figures to show intellectual humility, whether being smart makes people less intellectually humble and more vulnerable to misinformation, the role of narcissism in misinformation belief, the Dunning-Kruger effect, and why so many researchers lie about their work. Plus, Christy asks Tim for advice on how to navigate an alternative-medicine recommendation for IVF, and whether refusing to do it is a hill she wants to die on.

Timothy Caulfield is a Professor in the Faculty of Law and the School of Public Health, and Research Director of the Health Law Institute at the University of Alberta. He was the Canada Research Chair in Health Law and Policy for over 20 years (2002-2023). His interdisciplinary research on topics like stem cells, genetics, research ethics, the public representations of science, and public health policy has allowed him to publish almost 400 academic articles. He has won numerous academic, science communication, and writing awards, and is a Member of the Order of Canada and a Fellow of the Royal Society of Canada, the Canadian Academy of Health Sciences, and the Committee for Skeptical Inquiry. He contributes frequently to the popular press and is the author of the national bestsellers The Cure for Everything: Untangling the Twisted Messages about Health, Fitness and Happiness (Penguin 2012), Is Gwyneth Paltrow Wrong About Everything?: When Celebrity Culture and Science Clash (Penguin 2015), and Relax, Dammit!: A User's Guide to the Age of Anxiety (Penguin Random House, 2020). His most recent book is The Certainty Illusion: What You Don't Know and Why It Matters (Penguin Random House, 2025; Bookshop affiliate link). Caulfield is also the co-founder of the science engagement initiative #ScienceUpFirst and has written, hosted, and produced documentaries, including the award-winning TV show A User's Guide to Cheating Death, which has been shown in over 60 countries, including streaming on Netflix in North America.

Resources and References

Contains affiliate links to Bookshop.org, where I earn a small commission for any purchases made.


Transcript

Disclaimer: The transcript below was primarily generated by AI, so it may contain errors. The original audio file is available above.

Christy Harrison: Here's my conversation with Timothy Caulfield. You were first on the podcast a little over a year ago. Can you catch us up on what you've been up to since then?

Timothy Caulfield: It's been a busy year for sure, both good and bad. So let's start with the good stuff. Our team continues to do a lot of really interesting research and empirical research on misinformation, on health misinformation and how health issues are represented in the public sphere. We've done some really, I think, fascinating work on things like cancer misinformation. For example, we recently published a study looking at how cancer books fare on Amazon. What are the representations there? We found that if you get a cancer diagnosis and go to Amazon to get a book about cancer, which is a totally reasonable thing to do, we found that that search result will push out about 50% nonsense. So about 50%/49% of the books that are returned in an Amazon search have some degree of misleading information.

Timothy Caulfield: It's been a busy year for sure, both good and bad. So let's start with the good stuff. Our team continues to do a lot of really interesting empirical research on misinformation, on health misinformation and how health issues are represented in the public sphere. We've done some really, I think, fascinating work on things like cancer misinformation. For example, we recently published a study looking at how cancer books fare on Amazon. What are the representations there? We found that if you get a cancer diagnosis and go to Amazon to get a book about cancer, which is a totally reasonable thing to do, that search result will push out about 50% nonsense. So about 49 or 50% of the books that are returned in an Amazon search have some degree of misleading information.

And listen to this, 70% of the first page has misleading cancer information! So we've been doing a lot of research on that, really highlighting the degree to which health misinformation is everywhere. Another thing that happened is I finished another documentary for CBC on the manosphere, the rise of this idea that we need more masculine, manly men, and really that means more traditional masculine norms. And somehow this is going to make us healthier and happier and more optimized. And of course, Christy, this had a really big impact even on the election in the United States. So that was on CBC in October, and it's still streaming on CBC Gem right now.

I was really proud of how that turned out. And then most recently, I published a new book called The Certainty Illusion that really explores how our knowledge environment is crumbling. I make the pitch that we're kind of in a knowledge production and dissemination crisis right now. And I talk about the forces that got us to this place and, of course, what we can do to climb out of the hole.

Christy Harrison: Yeah. So important. And it's a great book. I really encourage people to check it out. It's interesting that you wrote a book about certainty and the benefits of not knowing, or of believing that you don't know. Why did you want to write a book about not knowing when talking about misinformation?

Timothy Caulfield: Well, I think that our information environment is so chaotic right now. It's just so noisy that the idea of certainty has become seductive. It's become almost like a marketing tool. And there's so many paradoxes here. There's so many paradoxes at a time when you think we'd have more certainty than ever because we have access to more information than ever before, right? And to good information. We could Google the New England Journal of Medicine right now and look at abstracts together and Lancet and on and on, right? But despite that, we've never been more confused, more polarized, and to some degree more uncertain.

And here's another paradox I talk about in the book: they market certainty, they market answers, but at the same time they profit from the chaos, from the uncertainty, from the fearmongering. And I wanted to explore all of those paradoxes. In the health space, I have a whole section on what I call The Goodness Illusion, and this is the idea that we all want to do what's right for ourselves. We want to do what's right for our families and our communities. And I call that goodness, right?

And the best example of that, which is very relevant to your world, is health halos: phrases and words like natural and toxin-free or chemical-free or GMO-free. These are shortcuts to certainty. And that's the promise. In this chaotic information environment, you want to do what's right for yourself and your family. And those words are signals to a shortcut, to making the right decision. And they're a promise of certainty, despite the fact that almost always, you know this as well as I do, that health halo does not fairly represent the actual science, right? About non-GMO, about organic, which at a minimum is more complex than the health halo wants us to think. And often it's the complete opposite of what the health halo promises. So I wanted to explore all of that and how we got here and what we can do about it.

Christy Harrison: Yeah, that is so important. You make the point that people are really desperate for accurate information in so many ways. And in the realm of science, this unbiased clarity that science provides or is supposed to provide is really seductive and really appealing because people are looking for that, looking for something to cut through the noise and science seems to be just that sort of instrument that can do that. And yet you make the point that science has become devalued precisely because it is so valuable and needed. I think that's a really interesting point. And can you talk a little bit about why that is and unpack this idea that science has become devalued precisely because people depend on it so much?

Timothy Caulfield: Yeah. Another paradox. At the beginning of the book, I joke that the Enlightenment has won, but mostly as a brand, right? Not in substance. We live in a world where pseudoscience and health misinformation are thriving. So it seems like, how can you say the Enlightenment has won? Well, it's won as a brand. People use science language, they use sciency-sounding arguments to support their claims precisely because people know that this is how we make assessments of our world. Science is a tool for certainty, to nudge us closer to certainty. So I call it scienceploitation: people using real science, exciting science, scientific terms in order to push nonsense, pseudoscience, or agendas, or to create brands.

And this has become an incredibly common phenomenon. No one says my product has less science. No one says my idea is not based on any evidence. They present science to support their claims. And then of course, the paradox is the more that happens, the more it devalues actual science. I think it's one of the reasons we're seeing a decline in trust in science. Science and scientific institutions remain relatively trusted around the world, but trust in those institutions is declining. And in fact, the voices that are using sciency terminology to support their claims often attack conventional science at the same time, because they want to make room for their worldview.

So at the exact same moment they're using sciency terms to support their claims, they're also attacking conventional scientific institutions. And of course, your listeners will know probably immediately the kind of rhetoric I'm talking about. You see it in the anti-vax community a lot where they'll use marginal papers or preprints or completely debunked scientific ideas to support their sciency idea, while at the same time discrediting actual science, actual scientific institutions and often hundreds of millions of data points on safety and efficacy. And they want it both ways, right? They want their science to be credible and persuasive, but real science to be basically decommissioned in a way. And unfortunately, the research tells us that this strategy is incredibly effective.

I joke at the start of the book about the word quantum. Just think how often the word quantum is rolled out, and often it's just stuck on a label or referenced in the description of why some bogus product works, even though, of course, there's no evidence that it has anything to do with quantum physics; it just feels sciency. I think one of the best examples is the phrase "stem cells." We've actually done a lot of research on stem cell hype. It's everywhere. It's on shampoo bottles. We're doing a study right now on stem cell supplements. It's absolutely everywhere. And people just have this vague idea that stem cells are cutting edge. People talked about how exciting they were for so long, et cetera, et cetera. And that strategy really does work, but it does so much harm.

Christy Harrison: Yeah, we talked last episode about gut hype and all the hype around probiotics and the microbiome and that's another thing we see slapped on everything is "support your microbiome," your gut health. It's probiotic shampoo, it's all these things that really don't have anything to do with the gut. But the word probiotic is being used to sell things that are further and further afield of what the original research entailed.

Timothy Caulfield: You're so right. And that's, I think, one of the best recent examples. I think we talked about this last time, if we went in a time machine and went back seven years, how many people would know what the microbiome even was? And now it is on shampoo bottles. It is incredible.

Christy Harrison: Yeah, it's wild. I think this goes back to something that we may have discussed in the first episode, but I thought about a lot when I was reading your book, which is the idea of having a broad trust in science vs. a critical analysis of science, right? And thinking of science as a method or a tool versus thinking of science as the answer or the set of answers or something. And I talk a lot here about study design, certainty of evidence, how to analyze a study, how to look at it critically in the context of other research, all of that stuff.

But one thing I haven't talked about nearly enough is predatory journals and the fraud that goes on in science. And you have some interesting information about that in the book. Can you explain what predatory journals are and why they're so insidious? Because I think this also connects to the anti-vax research that's coming out. That paper published recently in a totally bogus journal that was, from my understanding, actually just a WordPress blog. Not even a journal. There are bogus "journals" out there that present the veneer of science. They make people think it's actually peer-reviewed scientific research when it's really not.

Timothy Caulfield: Yeah. And this is, I think, one of the biggest challenges of our time. And I know this sounds like hyperbole, but I mean it. And the challenge I'm referring to is the pollution of our knowledge base, right? The pollution of the scientific literature with either fully bunk journals or journals of questionable quality. Predatory journals are probably the largest phenomenon in this space. And these are very low quality journals that have either no or questionable peer review. They either have no or questionable editorial boards. Basically, to overgeneralize, you can pay to publish there.

And if you are an emerging scholar and you want to get your work out, it can be very seductive. There's such pressure on the academic community to publish that sometimes individuals might publish in these journals and not even realize they're predatory journals. I get invites every single day from one of these journals and they can look very legit. They're not and unfortunately we're seeing predatory journals, these very low quality journals, really polluting not just the databases and the academic literature more broadly, but even things like systematic reviews, they're popping up in them. And that's because it takes effort to really dig into is this a good journal, is this not a good journal?

And so let's say you are an individual who's quite keen and wants to make sure you're only looking at the best stuff. So you see a social media post about some claim that coffee cures cancer, and then you look at the press release and that looks real, the press release looks impressive. And then you click on the hyperlink to the study, and who does that, Christy? Who even puts that much work in? Right? Then you click on the hyperlink to the study and it takes you to something that looks legitimate. And most people, I think, would probably just maybe read the abstract or scan it. Okay, there's a study out there that supports this, when in reality everything I just described was bunk. It was a misleading social media post, it was a misleading press release that was probably created by AI, and then it was a bunk study.

That's where we are now. We know that anti-vaxxers use these journals to push nonsense. The one that you referred to, Christy, is a really good example. This was RFK Jr. referring to this study in his Senate confirmation hearing. This is where we are; this is the level of pop culture conversation that we're having. He refers to it in the Senate confirmation hearing. If you went so far as to look at that journal, like, okay, RFK Jr. referenced this study, let's take a look at it, you'd find it's really nothing more than a blog post. It was an anti-vax fanzine, but it looks like a real study.

And then if you look at the authors, they're both associated with some research institute. And then you click on that research institute and it looks kind of real. Now, I dug into that research institute. There's almost no information on it. It's entirely mysterious. And then I went even further, Christy. I actually used Google Maps to locate it, and it's some person's house in the middle of nowhere, just to give you a sense of how random this is.

Christy Harrison: Yeah, no, it's totally wild. This house in the middle of the woods, nothing to do with a research institute. But yeah, how many people are actually going through all those steps? Because how many of us even have the time to do that? You and I, this is our job. So we can go that extra mile sometimes when we have the time. But who among us, who doesn't have that as their full time or part time job, has the time, the energy, the bandwidth to do that amidst all the other things they have going on in their life? It's just impossible to keep up with it all.

Timothy Caulfield: You're 100% correct. And in the book I even joke about how while writing the book I almost fell for predatory journals. And I think this is important to highlight because this can impact all of us. We have to be humble about this and be vigilant. I'm doing research on a topic and I find a conclusion that really speaks to my view or really speaks to my preconceived notions about the topic and think that looks like a really interesting study. And then you read the abstract and you get excited and then you see, oh, it's published in this journal.

And I get it, right? If you find something that speaks to your preconceived notions, you're more likely to embrace it, more likely to share it, and less likely maybe to be critical of where it was published, right? And so this can impact all of us. This isn't to be critical of individuals. This is my job, right? Most people are just too busy in their lives. Let's say you're a family physician or a nurse practitioner, how much time do you have to do that with every claim that's supported by a study?

Christy Harrison: It's just impossible to keep up with it all. And there's no simple heuristic, right? That's the hard thing, because we all want simple heuristics. That's your point in the book: that's why we're in this mess, we're looking for something to cut through the noise simply and easily. And it's often not that simple. There are no black-and-white answers to a lot of things. So, knowing that there is no simple, easy, black-and-white way to determine this, how do you tell a predatory journal from one that's more legit? Because I find too that there's a lot of gray area, where something will be on Beall's List of predatory journals, which has been roundly criticized by people for not being nuanced enough. And then there are open access journals, and I think open access is a great theory and model for disseminating science more widely. I think it's important.

And yet some open access journals are really bogus and some are pretty good. And there are many, it's my understanding, that charge a fee to be open access, where researchers have to pay to be in the journal. So how is that that much different from the predatory ones that charge without any peer review and stuff like that?

Timothy Caulfield: It is incredibly challenging. And yet another paradox has emerged, because the scientific literature has gotten so messy and so polluted that one of the best ways to cut through the noise is to turn to topic experts, subject experts who know what the body of evidence says in an area. So in other words, this situation has made the subject expert more valuable. But this is happening at the exact moment when experts are being demonized and torn down, and the world is being told that you can't trust experts. So that's really problematic.

The other thing that individuals can do is not fall for single-study syndrome, right? So if you see a study and it runs counter to the body of evidence or conventional wisdom, at a minimum, a red flag should go up. So I think that that is useful. And for most of the topics that individuals care a lot about, I think they have a vague notion of what the body of evidence says and what the scientific consensus is, whether it's about climate change or diet or vaccines. I think people have a general sense. So if you see something that sounds extremely contrarian, I think it should be a red flag. Contrarian things do happen and sometimes turn out to be true; it's just a red flag to dig deeper. So that's something that you can do.

The other thing you can do is you can see to what degree the study has been referenced by other researchers. Now, that's not a slam dunk because predatory publications do get referenced, and I talk about that in the book. But in general, the good and legit stuff is referenced more. And that may sound like it's a tough thing to do, but you can go to Google Scholar and just get a sense of how people are responding to that literature, if this is something that you feel really passionate about.

But that idea of leaning into the body of evidence is, I think, one of the safest strategies, especially for the big topics that matter most to us. And no, this does not mean groupthink. No, this does not mean giving in to the metanarrative pushed by someone. This is really about thousands of independent studies that point in a particular direction, recognizing that good science is always evolving and always contested, just recognizing that there's a body of evidence out there.

You know this better than I do, almost never will a single study change how we deal with a topic, especially in the health space because humans are complicated. The environment that we interact with is complicated. There's so many variables at play. It's almost always going to take a body of evidence to change how we view the science on a topic. Would you agree with that?

Christy Harrison: I think so, yeah. I'm thinking about my own perspective as an eating disorders dietitian who has become very critical of diet studies and all of that stuff, the diet advice that's out there both in social media spaces and the really sensational stuff, but even the kind of seemingly evidence based, the things that have a little bit more evidence behind them, like intermittent fasting for certain things or keto for certain things or whatever, where there may be a very narrow application or this may be useful in animals, but it's not actually borne out in humans over the long haul over multiple randomized controlled trials and replicated by different groups and all that.

But then I think about this piece that I'm curious to talk to you about, because I know we have probably somewhat differing opinions on this, which is the weight science piece. And that's where my view is a little different from what I think a lot of people would interpret as the body of evidence. But I think I, and a lot of people in this space that I'm in of eating disorder professionals who have an anti-diet stance, look at the body of evidence with a different perspective, which is like, okay, well, diets really don't work and we've seen that up close in our clients and many of us, me personally as well, didn't work for me, couldn't stick to it. I've seen it happen in so many people. And we see that in the research as well: very few weight loss diets actually produce long-term weight loss, and they have a variety of side effects as well.

And then there's this discourse out there, the general sense that I think a lot of people have of what the body of evidence says about weight, which is that higher weight is bad for health. That's the top line. And I look at that and I say, okay, but why is higher weight seemingly bad for health? Is it the weight itself, or is it what higher-weight people are told to do in the name of weight loss, right? Being told to eat in ways that are actually pretty disordered a lot of the time, and ending up weight cycling and the effects that that can have on various physiological systems, the weight stigma that people have to endure at higher weights, the connections with poverty and food insecurity and lack of access to care and the social determinants of health. All of that stuff plays a huge role in people's well-being, right? Social determinants, I think, account for something like 70% of health outcomes at the population level.

So, looking at all of that, my view on this body of evidence around weight is yes, we do see an association between higher weight and poor health outcomes. And there may be some level at which higher weight is causal, but we really can't know to what extent that's true until we can control for things like weight stigma, which is an independent risk factor for a lot of things. A huge number of things that get blamed on weight itself, like mortality, cardiovascular disease, diabetes, cancer, stuff like that, some forms of cancer, same with weight cycling, same with disordered eating. All of those things are such potential confounding variables that I think this narrative of high weight equals poor health is just at best really not nuanced enough and at worst maybe pretty wrong.

Timothy Caulfield: Well, I think this is a fascinating and a great topic for me, Christy, because I don't know if you remember, in the book I have this section where I talk about scientific humility, because science evolves, the evidence evolves. And sometimes that evolution is pushing towards more uncertainty, not less. People always think science takes us in the direction of more certainty. Kind of on the contrary, very often as more and better research is done, we become more uncertain about a topic. So I also think that we should always be willing to change our minds, right? And I argue in the book that changing your mind based on the evolution of evidence should be a badge of honor. It shouldn't be categorized as flip-flopping, with someone searching your social media posts to find something you said in 2015 now that you've changed your mind. That should be like a badge of honor, right?

So in the book, I say everyone should keep a list of things that they've changed their mind on, or even just modified their view on, because I think that that invites scientific humility. And the characteristic of scientific humility is correlated with a whole bunch of good things, including not falling for misinformation and not spreading misinformation. But I also think it is a form of critical thinking. Recognize that you can change your mind. And in that list, I have how I thought about obesity and weight gain and the role of weight in health. I don't know if you remember that.

Christy Harrison: I do.

Timothy Caulfield: Yeah, that's an evolution from one of my earlier books, which I still think, by the way, is mostly correct: how I talked about weight in The Cure for Everything versus how I think about it now, which is, I hope, much more nuanced. And I'm always trying to learn and evolve with the evidence, so I don't think you and I are that far apart on that. So this topic, I think, highlights the importance of the idea of scientific humility and allowing your perspective to evolve, even if it's an area where you work. Because as you know, if you've done a lot of work on a topic, it can be hard to allow your perspective to evolve.

The other thing I think it highlights is this concept of uncertainty, especially in the diet space. And again, I know you know this more than I do. Science is so often presented to the public as if it is certain. It's not. The headline doesn't say, "This was a cohort study and the variables are many, but there's some suggestion that..." No, it'll say, let's pick on coffee again, "Coffee allows you to lose weight" or something like that. People crave definitive answers. And we also know, research tells us this and I refer to it in the book, that those correlational studies, those cohort studies, are often the ones that get the most press, because they're about sexy topics and because they often feel definitive when they're not. Because sometimes the N can be huge.

The UK Biobank, there's 500,000 people in that. Not that every person in that biobank is involved in every study, but the N can look very, very impressive. So I think it really speaks to this idea of scientific uncertainty and the need to make sure that critical thinking embeds that concept. And the other thing, which I think is almost a truism here (I'm contradicting myself, I'm using the word truism), is that almost always in biomedicine and health, when you do more research, not always, but almost always, it gets more uncertain. The effect size gets smaller. It doesn't get bigger, it gets smaller.

Intermittent fasting, I think, is a really good example of that. You have this exciting research and really preliminary and speculative ideas about what intermittent fasting can do, and then the more research we do, the smaller the effect size gets. Let's say, just from a weight loss perspective, it turns out to be just another diet; there's nothing magical about it. And even for the more clinical outcomes, it gets more complicated, and we get a little bit of interesting stuff too. Maybe this does help for this community in a very particular way, but the magic starts to fade very quickly, and I think that that is almost always the case in biomedicine, whether you're talking about early phase one stem cell research or a new pharmaceutical or a new diet.

So that's a good reminder to always think of scientific uncertainty and be patient and let it play out. And that comes to the last point I want to make on this topic, which is just how messy the research is, especially in the nutrition space. Science is hard. It's hard to do this stuff well. As you pointed out in your question, factoring in socioeconomics as a variable just complicates everything. I've really become interested in the whole longevity noise that we're hearing now. Oh, all these magical things you can do to live longer, when the reality is the number one thing you can do is pick your parents. Make sure you have good genes and that you've been fortunate enough to land in a good socioeconomic situation. That's going to be the number one determinant of whether you're going to live a long time. Not seed oil!

Christy Harrison: Exactly. Just get a time machine and figure that out and you'll be fine. I think that is all super fascinating and helpful perspective. I flip flop myself on diet stuff, for sure. I am classically trained as a dietitian. I was also a nutrition journalist before that and came up in this world of so much hype around nutrition studies and believing that higher weight equals poor health, that was just in the zeitgeist, right? In the water I drank, in the communities that I came up in.

And so to start to think critically and differently about it was a real exercise in intellectual humility, I think, and forced me to reconsider so many of my priors. But now my priors are anti-diet, after 10 or 15 years of doing this work, with multiple books out there and stuff like that. So I'm like, okay, I also have to be flexible about these priors. And I don't think I'm going to completely disavow this point of view that I have, which I think is more nuanced than what I had before. I'm thinking about things like socioeconomic status and weight stigma and weight cycling, these complicating factors, and the picture has just gotten more complicated. And I feel like I don't want to be blinkered to other things that could be coming out too, so I'm trying to stay flexible, but it's hard. This intellectual humility thing is hard to do.

And I think the more expertise people get, sometimes the more ensconced they can be in their point of view. Scientists themselves can be very wedded to particular theories. Speaking of intermittent fasting, there was a really great example of intellectual humility that I think you don't see that often, at least not discussed publicly in this way, where a cardiologist at UCSF, who was a big proponent of intermittent fasting in his own life and practiced it for years, conducted a study on it.

And then the study found it didn't have the benefits he thought it would. And he was like, I stopped doing it, I stopped fasting myself. And he talked about that in an interview with the media. You may have seen this, but he just got ripped apart on Twitter and stuff, for being a traitor or a shill for big pharma or whatever people said. So it's very hard to be any sort of public intellectual and have intellectual humility and not be just tarred and feathered in the public square.

Timothy Caulfield: It's so true. And of course, you saw it happen with COVID, where people made comments in April 2020, maybe they said them on social media, and these scholars maybe made 50,000 comments in the public space, and then someone will go back and find that one less-than-ideal comment and post it on social media as if it's a gotcha moment. And it's wrong on so many levels. Even if that was the position, allowing your position to evolve is great. But it makes being a public intellectual very difficult, because then it forces you to kind of cast a chill on yourself, right? Oh, I don't want to say anything that can be used against me in the future. So then you get language that is less engaging and overly cautious, and the person who's making the definitive, wrong, misinformed statement becomes more seductive.

And I do think the intermittent fasting one is also a really good personal story, because maybe I'm old and cynical, so with every diet, out of the gate, I'm like, oh, this is B.S., and every supplement, this is B.S. And that's how I was with intermittent fasting: this is complete and utter nonsense. And then they started to have some interesting actual clinical studies. Oh, maybe I should park that cynicism and watch this unfold. And so it's been a roller coaster ride with me and intermittent fasting, with that cynicism always lurking in the background.

But that's okay. I think we need to recognize that it's absolutely okay to do that. That's actually kind of what you should do. You should follow that evolution of the evidence, carrying the body of evidence and your knowledge and the history of the field with you as you watch the science unfold. I think academics should do that, but I also think the public can and should do that when they see claims, especially in the health and diet space. They should carry that with them.

Christy Harrison: Yeah, that's really helpful perspective. Digging into intellectual humility a little more, I want to talk about the nuances of that. For those of us who've maybe been told our entire lives that we're smart, maybe smarter than average or whatever, and who have some objective evidence of that, like test scores, grades, awards, being knighted, for example. I have a feeling you fall into this camp, and I think many of my listeners do. And there's no way to say this without sounding totally not humble, the opposite of humble. But I've been there as well, and I've been thinking a lot lately about how smart people can be so wrong about things and fall for misinformation just like everyone else, but maybe even more so in some cases. Elon Musk is such a high-profile example of this and a real outlier in terms of intelligence. But I think there are so many examples, like the Nobel laureates who embrace misinformation, the so-called Nobel disease, and even very smart people in everyday life and everyday jobs who fall for this kind of stuff.

And I'm curious, from your perspective, do you think people who are smart, who've always been told they were smart, maybe are particularly vulnerable to misinformation because we've been conditioned to be more narcissistic or more arrogant about our intelligence? Maybe we have some objective evidence that we can actually understand complex things easily and have a lot of knowledge in general, but maybe that makes us overestimate our abilities and leaves us more vulnerable to the misinformation that preys on overconfidence.

Timothy Caulfield: So you're going to hate my response.

This post is for paid subscribers