Journalist and medical doctor Seema Yasmin joins us to discuss why misinformation and conspiracy theories about health and wellness are so alluring; how to recognize and fight back against false claims; the difference between misinformation, disinformation, and malinformation; holding the two truths that science is one of the best tools we have for finding facts and that science hasn’t always gotten it right; and more. Plus, Christy shares an excerpt from the audiobook of The Wellness Trap about wellness mis- and disinformation and how they’ve come to proliferate online.
Dr. Seema Yasmin is an Emmy Award-winning journalist, Pulitzer Prize finalist, director of the Stanford Health Communication Initiative, and professor of crisis communication at UCLA. Yasmin served as a disease detective in the Epidemic Intelligence Service and as a science correspondent for major newspaper and broadcast outlets. She is the author of five books, including What the Fact?! Her reporting appears in The New York Times, Rolling Stone, WIRED, Scientific American, and other outlets. She received her medical degree from the University of Cambridge and trained in journalism at the University of Toronto.
Resources and References
Seema Yasmin’s new book, What the Fact?! (Bookshop affiliate link)
Christy’s new book, The Wellness Trap
Christy’s online course, Intuitive Eating Fundamentals
Transcript
Disclaimer: While every effort has been made to provide a faithful rendering of this episode, some transcription errors may have occurred. The original audio file is available above.
Christy Harrison: Welcome to Rethinking Wellness, a podcast exploring the diet culture, disinformation, dubious diagnoses, and disordered eating that are so pervasive in contemporary wellness culture--and how to avoid falling into these traps so that you can find your own true well-being. I’m your host Christy Harrison and I’m a registered dietitian, certified intuitive eating counselor, journalist, and author of the books Anti-Diet, which was published in 2019, and The Wellness Trap, which came out on April 25th and is now available wherever books are sold. You can learn more and order it now at christyharrison.com/thewellnesstrap.
Hey there. Welcome back to Rethinking Wellness. I'm Christy, and my guest today is journalist and medical doctor Seema Yasmin, who joins me to discuss why misinformation and conspiracy theories about health and wellness are so alluring; how to recognize and fight back against false claims; the difference between misinformation, disinformation, and malinformation; how we can hold the two truths that science is one of the best tools we have for finding facts and that science hasn’t always gotten it right; and more. Plus, I share an excerpt from the audiobook of The Wellness Trap about wellness mis- and disinformation and how they’ve come to proliferate online.
Before the interview, a few quick announcements. This podcast is brought to you by my new book, The Wellness Trap: Break Free from Diet Culture, Disinformation, and Dubious Diagnoses and Find Your True Well-Being, which is now available wherever books are sold! The book explores the connections between diet culture and wellness culture; how the wellness space became overrun with scams, misinformation, and conspiracy theories; why many popular alternative-medicine diagnoses are misleading and harmful—and what we can do instead to create a society that promotes true well-being. Just go to christyharrison.com/thewellnesstrap to learn more and buy the book. That’s christyharrison.com/thewellnesstrap.
If you like this show and want to help support it, I’d be so grateful if you'd subscribe, rate, and review it. You can do that wherever you’re listening to this, and you can also get it as a newsletter in your inbox every other week, where you can either listen to the audio or read a full transcript, or both! Subscribe to that at rethinkingwellness.substack.com. And if you upgrade to a paid subscription, you’ll get early access to regular episodes plus occasional bonus episodes. Just go to rethinkingwellness.substack.com to learn more and sign up.
Now, without any further ado, let’s go to my conversation with Seema Yasmin. So I'd love to start by having you tell us a bit about yourself and how you came to do the work that you do.
Seema Yasmin: Yeah, very unexpectedly. So I'm from England originally and live in the States now, but I'm from an immigrant Muslim family in the UK that felt very disenfranchised and disconnected from healthcare in many ways. Even though healthcare is free in the UK, it's very paternalistic, with still very fixed ideas about what we would call ethnic minorities in the UK. And same with the press: we saw ourselves misrepresented or erased in the media. I felt maybe I wanted to counter some of that by becoming a doctor when I was younger, the first doctor in my family. So I went to med school in the UK and then, really quickly, even though I loved caring for patients, became disillusioned with the healthcare system, even with it being the NHS, which I revere in terms of it being free for everyone at the point of care. No one in England goes bankrupt because they broke an arm or got cancer; you pay your taxes, and therefore you get free healthcare.
But I still felt like we had this mindset, this approach to healthcare, that was very focused on patching people up so they were well enough to be discharged and another person could take their bed. And I felt like the hospital I worked at in the East End of London, where I was raised, exemplified the healthcare system in general: we operated a revolving door, and my patients just kept coming back into the ER, into the acute care unit, sicker and sicker each time, and we patched them up and sent them out. So at that point, a mentor of mine told me, hey, I think what you're frustrated by is what we deal with in public health, thinking about social determinants of health, why someone becomes your patient in the first place. And so I left the UK at that point, about 12 years ago, to serve as an officer in the Epidemic Intelligence Service at the CDC. People know that as being a disease detective. Obviously you've seen the movie “Contagion”; it's the job that Kate Winslet does, and it's outbreak response, but also thinking very deeply about social determinants of health and why outbreaks happen, where they happen, and who they happen to. And even in the case of a pandemic like COVID, why is it that at one point Black people were seven times more likely to die of COVID than their white neighbors? Those are the things that you deal with in public health. And so that's when my transition to public health happened, from being a hospitalist. Then I had another career transition, which I'll talk about really quickly. Once I was an officer in the Epidemic Intelligence Service, I noticed that at every outbreak I was sent to, it was never just the pathogen that was spreading; in tandem, there was always the spread of medical myths and health hoaxes and conspiracy theories. So after a few years, I left the CDC to go to journalism school to train to be a health and science reporter, because I wanted to understand why and how those anti-vaccine messages spread and how we could counter very powerful anti-science falsehoods. So that's me in a nutshell, and it kind of gives you an idea of the lens through which I look at the wellness space and through which I look at disparities in access to information, which is what I study now.
Christy Harrison: Yeah, it's such a fascinating career trajectory and definitely I have a lot of interest in those areas as well. So it's really interesting to hear about that. You've written that you come from a family of conspiracy theorists too, so I'd be curious to hear about that background and how that plays into the role you have now in helping debunk or dismantle conspiracy theories.
Seema Yasmin: One of the things we talk about when we train people on how to have conversations with parents who are opposed to vaccines, or however that might look, is to have compassion and understanding and empathy. And people are like, oh, it's so hard when someone holds oppositional views to you, or someone hates vaccines but you believe vaccines are safe. And I get that, but it's actually really easy for me in many instances, not all, to have that compassion, because I was that person. I was raised with some particular beliefs that might sound absurd or might not, but actually aren't that atypical when it comes to people who are immigrants, or people who are considered the other, or people who are on the margins of society, who make sense of a complicated and gray world and find belonging with each other through shared belief. And Viral BS has all these chapters, like “Do vaccines cause autism?” Well, obviously they don't, but it goes into the evidence. But there are chapters that ask questions like “Did the US government purposefully infect people with diseases to see what would happen to them?” And that's true, but it sounds conspiratorial because it really should not be true, but it is. It happened. And so I think it's really important that we take all of that into consideration when we think about people's beliefs and how we hope to shift mindsets.
Christy Harrison: Absolutely. That leads into something I'm really curious about, which is that you make the point in the book that conspiracy theories can be hard to counter because there are genuine conspiracies that happen. The horrific Tuskegee experiment is often invoked as part of misinformation to drive anti-vax beliefs these days. How do you counter the sort of conspiracy theory that is not based in fact when people are bringing up evidence of actual conspiracies that did happen, and how do you tell the difference?
Seema Yasmin: Yeah, I mean, having honest conversations. And one of the things that's really shot us in the foot is the fact that public health hasn't fully atoned for or acknowledged its bloody history. It took decades before the people who were experimented on and abused in the Tuskegee experiments were apologized to by President Clinton. It's taken a long time, and so that really puts us back. And even recently, a few years ago, there was a terrible outbreak of tuberculosis in the States, really, really high rates, and people in public health were lamenting that folks weren't coming forward to get tested, people weren't getting their x-rays. But you looked at where that TB outbreak was, and it was like an hour's drive from where Tuskegee happened. These things have very modern-day impacts. Also, though, we shouldn't just talk about historical examples, right? Because medical racism, for example, is very, very, very apparent today.
It's happening all the time. So these aren't just historical legacies; they're things that are happening now. So I think the thing that I find helpful with people is the kind of approach I take in Viral BS, which is: let's put it all out there. One example: my first book is about the history and origins of HIV. It's called The Impatient Dr. Lange, because it tells the story of this scientist who was looking for, and very close to finding, a cure for HIV/AIDS. And there's a part in that book where I talk about mistakes and accidents that happened with mass vaccination campaigns back in the day in the US. In its history, it did happen: people were harmed by those mass vaccination campaigns, and the problems were fixed, and then many more people's lives were saved because they were able to get safe vaccines.
This was a few years ago, and at the time the editor of that book had just recently had a kid and was really worried, because measles was back and whooping cough was back and all these childhood diseases that can be prevented by vaccines were reemerging. And he was really worried about people not getting vaccinated, and he was like, we should take this out. It's going to feed the fire and it's going to fuel people who are like, see, I told you vaccines were bad. And I was like, no, we need to have those conversations; I don't want to delete it from the book. It's something that happened, and we can talk about how mistakes were made and then how things got resolved. But I see that temptation. I just think it's much more conducive to progress to have radical transparency and to put it all out there.
Christy Harrison: I think about the Tuskegee experiment and how, as horrific and awful as it was, and as many years as it took to get any sort of public acknowledgment of it, it did lead to more protections for human subjects in trials. So not that that's any sort of silver lining for the families that suffered because of it, but things have changed as a result of that.
Seema Yasmin: I mean, I do still worry about unethical things happening in trials and the government being unethical, so it's not like that makes it go away. But I think what I'm saying is that you're more likely to have a successful conversation with somebody if you acknowledge and agree with them that, yeah, some egregious stuff has happened. It doesn't mean that vaccine X that we're talking about now, or something new, is also going to be tainted by that. We can have all of those things be true at the same time.
Christy Harrison: And holding that complexity, although that's so tricky to do on social media and in this environment that really privileges black-and-white thinking. So I do appreciate how you unpack things with a lot of nuance in that book, Viral BS, and in What the Fact as well, which I'm curious to talk about. That one is geared primarily toward teens, although I think it really is good for people of all ages. Why were you interested in writing a book for that audience specifically?
Seema Yasmin: Yeah, well, there was so much interest in Viral BS, which came out early in 2021, and at the same time we understood that these problems of misinformation and disinformation are huge. They're big problems, and there aren't enough resources for young people to make them savvy consumers of information. So what I do in What the Fact, much more so than in Viral BS, is take a very solutions-oriented approach: to say, for example, misinformation and disinformation are these awful problems, but hey, there are people working in different sciences and social psychology and philosophy and other areas who are coming up with evidence-based, proven strategies for countering misinformation and disinformation, and we're only going to stand a chance against this problem if we use those approaches.
Christy Harrison: I think it's really well done. And I'd love to start by just defining the terms you're using: misinformation and disinformation, as well as malinformation, which is one I don't think we hear a lot about these days.
Seema Yasmin: So I talk in What the Fact about staying away from the term “fake news” because of the way it's been weaponized and used to silence people, especially journalists, and is often lobbed like a word grenade by people in power who don't want people exposing them. They just shout fake news. So instead I want people to be specific about the language they use to describe these problems. There's an umbrella term called information disorder, which includes misinformation, disinformation, and malinformation, but in What the Fact I go into other kinds of information that can be misleading too. In a nutshell, misinformation is false information that's spread where someone doesn't know that it's a falsehood, and they're certainly not spreading it with any intention to cause harm. It might be someone saying to you, hey, I heard you won't get COVID if you gargle with saltwater. If they don't realize that's false and they're trying to help you, then that's considered misinformation.
Disinformation, on the other hand, is false information that's spread knowingly, knowing that it's false, and it's spread with the intention to cause chaos and harm. So the Russian groups that spread disinformation: a few years back when Ebola was spreading, they claimed there was an Ebola outbreak in Atlanta, there wasn't, but they spread that on social media and other platforms online as a way of seeding chaos in the US by really scaring people. And then malinformation is accurate information, actually, but accurate information that's made public that perhaps shouldn't have been, like someone's private information, and it's done with the intention of causing harm and chaos. So there's intent in there, which is an interesting element. And then, like I said, there are other types of falsehoods I also describe in What the Fact.
Christy Harrison: It's really helpful, I think, to articulate each of those types of falsehoods, and the logical fallacies that you delve into as well, helping people unpack arguments and how they're problematic. You have a section about the disinformation playbook that I'd love to talk about a little bit, disinformation being misinformation with intent to deceive. I thought there was a really interesting point in there about how the recipe for disinformation is to include a grain of truth, and you talk about the Russian state outlet Russia Today, RT. I didn't know this; I thought that it was pretty much entirely propaganda, but you said that it's actually like 80% real journalism and 20% propaganda. And so it gives it this veneer of respectability and allows people to say, well, look, it's not actually propaganda, because this is real information. Can you talk a little bit about why that mixture of true information and mis- or disinformation is so effective at spreading falsehoods, and how people can spot that for themselves and counter it when they see it?
Seema Yasmin: So information warfare is really, really powerful. It's a very time-tested technique for radicalizing people or duping people or manipulating them in different ways. It's been around for a long time. So what I talk about in What the Fact is this idea that it's not new. Okay, the social media part is new, the platform being used to spread it, but the actual strategies used to make a lie appear believable and compelling, those are not new strategies. They've been around since the time of the KGB and the Soviet Union, and possibly before then. And they've also been used by big tobacco to lie to us, and they're being used by big oil now to dupe us about climate change and fossil fuels. So what I do in the book is kind of break down the steps that they use to lie to us and then explain why those are so convincing.
And by doing that, yeah, it's been studied that you can actually get this protection from understanding the tactics that are used. And like you mentioned, one of those tactics that I describe is having a lie that contains a kernel of truth, because then the whole thing seems much more believable; it's harder to untangle. So for anyone who's watched Russia Today or seen the videos, some of their reporting is really, really good, but then a large chunk of it is propaganda, and it's just harder to separate out the propaganda when you watch some stuff and you're like, this is fantastic reporting. So yeah, that's another one of those time-tested strategies that's known to dupe us very effectively.
Christy Harrison: And what would you recommend in that situation, if someone watches the reporting and is like, this is really good? How do you know exactly what is real and what's propaganda in there? Would you say that's a source that should just be avoided completely, or critically appraised? What should people do with sources like that when they know it's a mixture?
Seema Yasmin: We have to develop tools of critical thinking and teach them to young people early on. So it's about being aware of how information warfare and propaganda are weaponized, how the strategies are used. And I talk in the book about the five key techniques used to spread science denialism: fake experts, logical fallacies, impossible expectations, conspiracy theories, and cherry-picking. And there's a whole taxonomy that I go into in more depth in What the Fact. And what's been found is that when you teach somebody not just, oh, this is a lie, but when you peel back the curtain and say, here are the five techniques that were used to lie to you, what happens is people start spotting those red flags more and more. And so it gives them protection against lies being told to them not just about vaccines, but about climate or about racism or critical race theory or whatever. So that's one of the evidence-based approaches that I describe in What the Fact.
Christy Harrison: It's really helpful. I think one other thing that you do a really great job of is articulating how stories affect us and affect our brains. There's a really great illustration of that in one of the chapters, where you open with this compelling story and then you say, here's another way of telling the story, and it's just boring facts. And then you use that to sort of unpack what's going on in our brain when we read stories. Can you talk a little bit about why stories stick with us so much more than just dry facts, and what we can do if we know, or perhaps don't know, that we've been exposed to stories that contain misinformation? What is so sticky about that?
Seema Yasmin: Yes, I think many of us instinctively know that, right? Facts are hard to memorize, but stories you'll remember. You'll remember the way something made you feel, you'll remember characters, you'll remember things that happen. So as you described, I do that in the book to kind of illustrate it: test yourself, what's going to be more memorable? And then I explain the neuroscience of it. It has a lot to do with how our brains assimilate information. We like to think we're really data-driven and logical and stuff, but actually our brains are hardwired to connect through stories. We're susceptible to stories, and that's a lovely thing. It's how we learn language and cultural norms, and it's how we learn how to interact with the world and others. But it can also be used against us. And again, it's kind of like this technique of cognitive immunology, this kind of immunity you can build against lies by being aware that this is one of the strategies that can be used, and it can be used for good and bad.
It can be used to share compelling information that's accurate, and it can be used to spread lies. So one example is that the anti-vaccine movement can be really effective because they do very targeted messaging, and they do very emotionally triggering and compelling and memorable and shareable messaging. They'll do these viral videos that spread on TikTok or on YouTube of a mother crying and clutching her child, who's crying, and saying, my kid could talk until they got the MMR vaccine. But the way that public health and science will counter a viral campaign like that is with facts, by saying, well, the vaccine is safe. And I'm not saying we don't need facts, we absolutely do need facts, but it's how we package those facts that makes all the difference in terms of whether the facts will stick, whether the facts will be believed, whether the facts will be shared. And we really fall short on that in science communication at the moment.
Christy Harrison: And it's so hard to do. It feels like a very elusive thing for people with a scientific background. I feel like your background, having been a public health professional and then working as a journalist and learning how to tell stories in a compelling way, is really useful here, because we do have an audience, or at least for my first podcast we have an audience, of a lot of health professionals and clinicians, especially nutritionists, dietitians, therapists. I'm curious, for folks in that boat, how you suggest talking to people about facts, say, a nutritionist who's trying to counter some wellness misinformation that one of their patients is holding that's causing them to have disordered eating patterns. What are some ways of talking about that that can help people unhook from the misinformation?
Seema Yasmin: One of the things is to consider taking talking out of the equation, because we feel like, I am the holder of all this knowledge, I am the bearer of facts, I need to get this information across. And so when people come to my communication sessions, they come with a notebook and a pen and they're like, tell me what to say to be an effective communicator. And I'm like, put the pen down. This is what we get wrong. We think communication means talking: messaging, bullet points, pamphlets, PowerPoint presentations, sentences. And actually it's not, and this is where we fail. You don't want to be talking at people. All these studies show that pouring facts onto a conversation that's already polarized, that's already felt to be contentious, is like pouring kerosene onto a fire. The whole thing goes up in flames. First of all, how dare we even assume people don't know the facts already.
They've probably heard what we want to say a million times. They may have heard these studies before, these data points, this advice. So the hubris baked into this idea that I'm going to share something that's so novel, or I'm going to share something that's going to be the thing that changes this person's mind, I think speaks to the kind of paternalistic mindset we often have in science. This omniscience, I mean, science is baked into that word: we're all-knowing, we're the sage on the stage who's going to be the person that delivers this messaging. One, you need trusted messengers to do communication, and we're often not the trusted messenger for everybody, for every community. We may not look or talk like, or have the same shared beliefs and cultural backgrounds as, the people we're speaking with. They may have some element of trust in us, but we may not be the most trusted person to deliver that information.
But yet here we are, we're in that position. So really what you want to do is not talk very much, at least in the early consultations, and arguably all throughout. What you want to do, especially when you feel like, I've just got a few minutes, how do I get this across, is ask questions that help you understand why they hold a particular belief. And then you want to listen, and then you want to use techniques like looping. I describe this in What the Fact; there's a script as well for how to put it into action, so it's not abstract at all, it's very practical. But you want to ask questions that help you gather intelligence on why this person believes what they believe. Without that information, how would you know, for example, why a person is refusing a vaccine? What we will do is say, oh, this person seems vaccine hesitant.
Maybe they've had their COVID shots, they don't want to get the booster, but it is safe, we'll say to ourselves, and we'll say to them: it's safe, you should get it for these reasons, here are the ingredients, it works, blah, blah. What we're not doing is realizing, as the data shows, that if you have six people say to you that they don't want to get the bivalent booster, you might hear six completely different reasons for why they don't want to get vaccinated. And yet it's almost like we hit everybody with the same one-size-fits-all blanket messaging without doing that very crucial asking of questions and really, really listening, putting your assumptions to the side and trying to understand why they hold this belief. I think only then do you stand any chance of helping them shift their mindset or become open to different information.
Christy Harrison: Yeah, I think that's so true, and it's so hard to do. I think as health professionals we're conditioned to have all the answers, and there's so much of that that has to be unlearned. But then also, I think for something like anti-vax beliefs or vaccine hesitancy, which has such potentially significant, harmful consequences for a person's wellbeing and for public health as well, it feels like the stakes are so high: I have to get this information across. And especially if you only have 15 minutes with a person or something, it's like, how can I convey this in the quickest possible way? And often it just comes out as, blah, they're safe and effective, and there's no context to it.
Seema Yasmin: Yeah. The really weird thing about that, though, is we all practice evidence-based medicine, but we're not practicing evidence-based communication, even though there is an evidence base for how to effectively communicate science. It doesn't come from scientists; this research often comes from scholars of communication, and because things are so siloed, we're not reading what they're writing. And for decades they've been telling us that our default method of science communication, which is known and described as the knowledge deficit model, does not cut through biases. It doesn't build trust, it doesn't build relationships in which there can be a dissemination of information that goes against someone's core beliefs. So you say it's hard, and sure, I'm not saying it's easy, but what happens when I teach this stuff is that over time people come back and feel very connected to their patients. And actually, we've studied it, especially at places like Stanford, where we do these enhanced communication trainings, teaching people evidence-based methods. People will often come in and look at me and be like, well, I've been a doctor longer than you've been alive. What can you teach me?
But listen, you would think that, based on us teaching evidence-based approaches to effective communication, patient outcomes would improve. You're hoping to foster doctors and nurses and allied healthcare professionals who really connect with patients and really listen, so patient outcomes should improve, and they do, which is great. It's how we get our funding for these programs: like, look, patients get better, they get discharged, they're more satisfied. However, or in addition, what we also find is that physician wellness improves and burnout rates decline, because people suddenly feel, like I said, more connected to their patients. They've gotten more connected with the reason they went into healthcare and looking after people in the first place. So it can sound hard, and it can be, because it's a paradigm shift from how we've been taught communication so far, but I promise you: one, it is more effective; two, it reminds you that it's not a one-size-fits-all approach for everyone. In What the Fact, I kind of want to arm people with a whole toolkit, and then you decide: I know a few evidence-based strategies that would work in a situation like this, what do I think is most appropriate for this person? Or shall I pick three of them, for example? So it's empowering in many ways, and it gives us a chance at countering the misinfodemic.
Christy Harrison: That's such good advice. One other thing that's sort of related to that is that science doesn't always get it right. And there's such a complicated history. We talked about the Tuskegee experiment, but there are so many examples of how science was weaponized or used in harmful ways to justify racism or other forms of oppression. And yet science also gives us evidence-based tools, like what you're talking about with science communication, or just all the myriad things it's given us. And I think sometimes people struggle with this, especially people who are social justice minded and maybe have done a lot of learning online about the history of medicine and science and the social justice aspects of that. So how do you talk to people and suggest they square those things, both the harms that science has caused and the benefits of taking an evidence-based approach?
Seema Yasmin: Yeah, I talk about this in a part of What the Fact called spurious science. And I mean, I am a scientist, so I talk about the fact that some of the originators of the field of statistics invented statistics as a method for justifying slavery and for proving, in their words, that they could use this field of statistics to prove that Black people were inferior to white people. Does that mean we throw statistics out? Well, actually, we can use those same tools to show that they were very, very wrong. So for me, it's really about understanding that history, trying to hold some of these institutions or legacies to account, but then empowering ourselves with that knowledge and using it in service of social justice and equity. That's how I think about it. You think about the history of eugenics, which was treated as just science: many scholars will tell you it was the tool that was used to create arbitrary racial classifications that, again, were used to justify the enslavement of African people. So yeah, there's that bloody history, but it's also realizing that science is a process. We can acknowledge that history, we can reclaim it, and we can use it now in ways that do good as opposed to ways that do bad.
Christy Harrison: Yeah, I definitely agree with that, and I've grappled with that myself, around what role science should play in my work. I've heard a lot of people kind of come down more on the side of lived experience, saying lived experience should be the thing we use to guide our decisions rather than science. Some people are so radical as to say science should have no place, it should all be lived experience. What would you say to that argument?
Seema Yasmin: Well, I don't know how they're mutually exclusive. I mean, science is a process of asking questions, testing those questions, having an open mind when you're assessing the data. And to me that's like living; that's what we're doing every day. So I don't see lived experience and science as mutually exclusive. Maybe what somebody might define as hardcore Western science is exclusive of that, but I don't like that concept of Western science. And also, to that point, this came up recently, that point of: we can't be wrong. I train CDC scientists and WHO leaders in science communication, and at a WHO leadership training that I was doing a couple of weeks ago, one of the leaders was lamenting this. I was talking about how to communicate uncertainty and how to do radical transparency as a way of building trust, because communication hinges on trust. Good luck being an effective communicator if you're not trusted; it doesn't happen. So this leader was talking about having to always be right and never being able to be wrong, and so he was saying to me, how does radical transparency work? And I was like, who told us that we could never be wrong, and that as scientists we were supposed to be all-knowing and always right? That's just not how it works, and what a horrible way to live, and what an inhumane burden to put on people, and what a lie to tell the public. Really what we need is a bidirectional flow of communication and knowledge between the public and science. We've siloed and ivory-towered this to death, in a way that's to the detriment of all of us, to the detriment of public health. The public needs to be engaged in the scientific process, and the scientific process needs to engage with the public's lives and lived experiences. So to me, all of those things are connected, and all of them should be melded together if we're going to do anything in service of equity, anything in service of improving public health.
Christy Harrison: That's really well said. Well, thank you so much for this. It's been really wonderful talking with you. Can you tell us where people can find you and learn more about your books?
Seema Yasmin: Yeah, sure. Thank you so much for having me. If you want to read What the Fact or Viral BS, you can go to my website, SeemaYasmin.com, or search for them online; they're very easy to find. And What the Fact, because it's aimed at younger readers, although adults are reading it too, is accessible to everyone. There's a teaching guide we've created for What the Fact. I'll put it on my website soon, but right now, if you just Google “What the Fact teaching guide,” the first result should be the Pulitzer Center's website, and you can find the teaching guide there. It's got lesson plans and activities and chapter guides, and we're also doing lots of school visits to make sure that young people are getting access to information about critical thinking, media literacy, and social media and digital literacy too.
Christy Harrison: That's fantastic. We'll put links to that in the show notes for this episode as well. And thank you again so much for being here. It's great to talk with you.
Seema Yasmin: Yeah, thanks for your great questions.
Christy Harrison: Thanks so much to Dr. Seema Yasmin for that great conversation!
And now I want to share with you an excerpt from the audiobook of The Wellness Trap about wellness mis- and disinformation and how they’ve come to proliferate online. I think this is a great companion to the interview with Seema, and I hope you enjoy. It’s from the beginning of Chapter 4, and you can listen to the rest by buying the audiobook wherever you buy books or at christyharrison.com/thewellnesstrap.
Chapter Four, Mis- and Disinformation:
When I first started struggling with a constellation of symptoms that doctors couldn’t explain, it was a simpler time to seek out information. The year was 2003, and the only social media site I was on at the time (or even had heard of) was Friendster, one of the original social networks and the progenitor of the friend request system that Facebook later popularized. Unlike the Facebook of today, Friendster didn’t have creepy ads that tracked you across the Internet. I can’t even remember if it had ads at all back then (though by 2004 it did, according to a Wall Street Journal article from that year with the now quaint headline “Advertisers Seek Friends on Social-Networking Sites”).
I didn't post about my health concerns on Friendster, though. Instead, I got my health information by Googling around, finding and peeking into message boards dedicated to particular diseases: celiac disease, thyroiditis, digestive disorders. I never posted myself, but I lurked and picked up some ideas from those forums, such as the notion that gluten might be at the root of my disparate symptoms (which I much later learned definitely was not the case), or the idea that something hormonal might be going on beyond my thyroid problem. Yet much of the content in those spaces never really resonated with me.
Perhaps because in some cases, as with the celiac board, I never actually had the disease; or maybe because, though I was desperate for answers, the people on those boards seemed so much more desperate and willing to try extreme things. Though I engaged in many disordered eating behaviors and was obsessed with healthy diets and cutting out certain foods, the severe and esoteric restrictions and unproven wellness culture practices that some of the people on those message boards were advocating didn't feel right to me at first glance, and it was fairly easy to forget about them and move on. If I were going through the same thing as a young person today, I have no doubt my experience would be very different. Now, arcane and often bizarre food restrictions are peddled by wellness influencers with millions of followers, and even casually viewing or liking those pages can cause sophisticated algorithms to serve up increasingly extreme diets and harmful practices.
Those same algorithms can drive wellness seekers to dangerous pseudoscience and misinformation. Dubious alternative-medicine advice that was still very much on the fringes back when I was first struggling with chronic illness now often shows up on the first page of search results, driven by personalization algorithms and those sites' well-honed social media and search engine optimization strategies. Much of this shift is a result of the modern attention economy and what author and social psychologist Shoshana Zuboff calls surveillance capitalism: the collection and sale of our personal data, search histories, social media likes, page views, age, location, et cetera, to advertisers, who use it to precisely target us with ads designed to capture our attention far better than non-personalized ads ever could. The technology of surveillance capitalism is what allows those creepy ads to follow us around the internet, but its influence is far more insidious than just making you buy some skin serum you might otherwise have forgotten about. This technology also makes people vulnerable to mis- and disinformation, opening us up to scams and conspiracy theories and ultimately compromising both our individual health and our collective wellbeing.
Misinformation and the Attention Economy:
Healthcare professionals and consumers alike have been concerned about the quality of health information on the internet since the early days of what was then known as the World Wide Web. In the mid-1990s, doctors and information specialists began to sound the alarm about the dangers of having a space where anyone could say whatever they wanted about health-related topics without any of the usual editing and filtering out of misinformation that happens in traditional publishing. Initially, this growing concern over the quality of online health content led the medical community to push for greater regulation, but it quickly became clear that content was proliferating too quickly for regulation to have any hope of keeping up. Instead, doctors and other medical professionals began advocating for market-based strategies to prevent the emergence of misinformation and combat what did get through. A number of academic and commercial sources launched accreditation initiatives, creating criteria for judging the quality of health information that appeared on various websites.
These initiatives included things like seals of approval and awards placed prominently on health sites, which at the time may have felt slightly less futile than that sounds today. Eventually, those initiatives attracted the attention of the World Health Organization (WHO), which wanted to launch one of its own. The WHO got in touch with the organization that essentially runs the internet, the Internet Corporation for Assigned Names and Numbers, or ICANN, and submitted a proposal to create a top-level health domain that would signify a website met stringent quality and ethical criteria. Health websites wouldn't have to use the health domain, but it would act as a reliable filter for users who wanted to search only credible sources.
ICANN's chairman at the time said it seemed like a beneficial tool and encouraged the WHO to pursue the idea further, but his support was no match for the opposition. In late 2000, numerous stakeholders successfully argued that the internet couldn't be regulated in this way, that users were already savvy enough to recognize online charlatans, and that no one organization should have the power to pass judgment on thousands upon thousands of websites. Even then, not long after the advent of social media in its nascent form, there was a reflexive resistance to the possibility of having health-related myths and disinformation automatically flagged. Opponents of regulation framed their objections in the broadest possible terms: it was, they said, a matter of nothing less than individual liberty. In ascendancy was a free-market, neoliberal belief that would go on to change the face of healthcare: the idea that care is not a right but an individual responsibility.
As such, the thinking went, people were free to educate themselves, or not, about health-related matters. In this view, regulating what people could say about health would amount to a restriction of free speech. Flash forward 30 years, and we're having a not dissimilar debate about whether social media companies have the right and a responsibility to remove health mis- and disinformation from their platforms. In my view, it's more urgent than ever that tech companies step in and stop the spread of false health and wellness claims online, but at this point they have no real incentive to do that. Under Section 230 of the Communications Decency Act, tech companies aren't legally liable for user-generated content, which means people can say whatever they want on social media and other tech platforms without those platforms having to worry about being sued when users inevitably post false, libelous, or otherwise harmful things.
Traditional publishers don't have the same freedoms and are often rightly held liable for content they create and disseminate. But under Section 230, platforms like Facebook, Twitter, YouTube, and others aren't considered publishers. That wasn't always the case. In 1995, the brokerage firm Stratton Oakmont, whose co-founder Jordan Belfort was portrayed by Leonardo DiCaprio in The Wolf of Wall Street, brought a successful defamation suit against the internet service provider Prodigy, which the New York Supreme Court deemed to be a publisher. The rationale was that Prodigy had exerted editorial oversight by moderating some posts and prohibiting certain content, and therefore it wasn't simply acting as a content distributor in the manner of a newsstand or bookstore. This decision riled two members of Congress, who worried that it would make tech companies less likely to moderate any content, leading every website to become overrun with pornography. So they worked together to get a pivotal sentence included in the 1996 Communications Decency Act: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Section 230 is the foundation of online life today, “the 26 words that created the internet,” as it has come to be known. Social media probably never would've become this big, this wild, this unwieldy, if it existed at all, without Section 230. Amending it to close the publisher loophole, at least in certain cases, which lawmakers have floated several times over the years with varying levels of bipartisan support, would force social media companies to ensure that their platforms don't allow mis- and disinformation to go viral. The move would dramatically curtail the spread of false information. So far, as of this writing, that hasn't happened. A key reason tech companies are slow to act in preventing false claims is financial: social media algorithms are designed to promote engagement and keep people liking, clicking, and sharing on the platforms, which in turn brings in more advertising revenue for them.
And what kind of content does that best? Content that provokes anger and disgust, uses moral-emotional language, and/or introduces seemingly novel information, as numerous studies have found. That's why the YouTube algorithm tends to pick up and promote videos that have words like hates, destroys, obliterates, and slams in the title, and why every word of moral outrage added to a tweet raises the retweet rate by an average of 20%, as journalist Johann Hari wrote in his 2022 book, Stolen Focus. If it's more enraging, it's more engaging. Falsehoods often hit all of those notes: novelty, disgust, and moral outrage, and that combination can make them spread like wildfire. In a 2018 study, a group of researchers tracked more than 125,000 rumors disseminated by more than 3 million Twitter users and found that false claims diffused significantly further, faster, and more broadly than the truth. The truth rarely reached more than a thousand people, while viral misinformation routinely racked up audiences of between 1,000 and 100,000, and it spread about six times more quickly. False information was also 70% more likely to be retweeted than the truth, and it was mostly real people, not bots, doing the retweeting.
Those spreading misinformation are in turn rewarded with likes and shares, which reinforces the dynamic. In the current social media landscape, lies often go viral while the truth languishes. That's true for rumors about all kinds of subjects, from politics to celebrity gossip, but false information about health and wellness is particularly prolific: as one physician and TikTok influencer estimated in 2022, for every large creator on the platform who is genuinely evidence-based, you've got 50 or 60 big creators who spread misinformation about health. That includes harmful diet advice, unsubstantiated claims that food ingredients cause cancer, the false notion that only natural medicine can be trusted, myths about contraception, recipes for potentially deadly herbal abortions, and many, many more untrue and misleading ideas, across all social media platforms and related technologies such as YouTube.
And when misinformation deals with health and wellness, it poses an extreme risk to anyone in its path. Perhaps the deadliest kind of all is vaccine misinformation, which we'll discuss in detail in the next chapter.
So that’s our show! Thanks so much to our guest for being here, and thanks to you for listening. If you enjoyed this conversation, I’d be so grateful if you’d take a moment to subscribe, rate, and review the podcast wherever you’re listening to this. AND you can get new episodes delivered by email every other week by signing up at rethinkingwellness.substack.com, where you can also become a paid subscriber for early access to episodes and to help support the show.
If you’re looking for help healing your own relationship with food and breaking free from diet and wellness culture, I’d love for you to check out my online course, Intuitive Eating Fundamentals. You can learn more and sign up at christyharrison.com/course. That’s christyharrison.com/course.
If you have any questions for me about wellness and diet culture, you can send them in at christyharrison.com/questions for a chance to have them answered in my newsletter, or possibly even on this podcast sometime in the future.
Rethinking Wellness is executive produced and hosted by me, Christy Harrison. Mike Lalonde is our audio editor and sound engineer. Administrative support from Julianne Wotasik and her team at A-Team Virtual. Album art by Tara Jacoby and theme song written and performed by Carolyn Pennypacker Riggs. Thanks again for listening!