Rethinking Wellness
Rumor Mills, Propaganda Machines, and Debunking Anti-Vaccine Misinformation with Renée DiResta

Social media researcher, professor, and author Renée DiResta joins us to discuss how anti-vaccine messaging spreads online – including ways we can combat the rise of wellness misinformation in 2026 and beyond.

We also discuss how viral rumors spread on social media, why wellness influencers have become invisible rulers, the difference between moderating “good” and “bad” information, and how algorithms become propaganda machines.

Behind the paywall, Renée offers advice on how to defend science in our local communities, simple ways to amplify good information online, what appropriate skepticism looks like, and why she still uses and recommends Reddit.

The first half of this episode is available to everyone. To hear the whole thing, become a paid subscriber here.

Renée DiResta is a social media researcher and the author of Invisible Rulers: The People Who Turn Lies into Reality. She studies adversarial abuse online, ranging from state actors running influence operations, to spammers and scammers, to issues related to child safety.

In October 2024, she joined the Georgetown University McCourt School of Public Policy as an Associate Research Professor. Prior to that, she was the Technical Research Manager at the Stanford Internet Observatory.

Resources and References

Contains affiliate links to Bookshop.org, where I earn a small commission for any purchases made.


Transcript

Disclaimer: The below transcription is primarily rendered by AI, so errors may have occurred. The original audio file is available above.

Christy Harrison: I’m looking forward to talking with you about your new book, Invisible Rulers, which is all about how propagandists and algorithms are shaping public opinion and our understanding of the truth and how to fight back. And I think this is so essential to understand when it comes to dealing with wellness information and all the misinformation that’s out there. You’ve been researching these dynamics for a long time. So to start, I’d love to hear about how you got into this work.

Renée DiResta: Funny enough, I guess you could say kind of through wellness or the vaccine space. Anyway, I was a tech startup founder at the time in venture capital and tech startups, kind of overlapping. And I had my first baby in 2013. I was living in San Francisco. I had moved there about a year and a half prior. And I did this thing where you had to put your kid on these preschool waiting lists, and you had to do it right around the time of their first birthday.

So I was pulling down all these data sets that you could get off the California Department of Public Health website with school vaccination rates. This was because I'd read a couple of articles with titles like, Google's preschool has vaccination rates lower than South Sudan's. I think that was an actual headline.

Christy Harrison: Oh, my God.

Renée DiResta: I’m not making this up. And I said, okay, I came from New York. We didn’t have what California called the personal belief exemption, where you could just write on a piece of paper, I’m not vaccinating my kid. And that was considered enough. You didn’t even have to pretend to be religious. You just said, I don’t want to do this and that was the end of it. Which was why you had some schools, particularly like in the Waldorf schools, with vaccination rates down in the 30th percentile.

Christy Harrison: Wow.

Renée DiResta: I wrote this blog post as I did this data analysis, as a kind of data science person. So I made this set of charts and graphs showing, like, the 10 years of data of the rise of these personal belief exemptions and the rise of these schools that had literally vaccination rates lower than South Sudan’s. So I did that work, and then I published this blog post, put my kid on these lists, and then the Disneyland measles outbreak happened maybe two months later. I have to say I felt a little bit vindicated because I was saying, like, come on, this is like a disaster waiting to happen. What are we even doing here, guys?

Now, one of the things that had been frustrating to me during my pregnancy was that I was being constantly bombarded with anti-vaccine content. And the anti-vaccine movement really makes an effort to get to you when you are still pregnant. You’re in these, what they call, birth month cohorts where you’re with a bunch of pregnant people who are at the same phase of pregnancy as you. And there would always be these big flame war fights. Well, you’re not actually getting vaccinated, are you? About you yourself, right? Should you get your MMR booster, let alone, are you going to vaccinate your kid after he or she is born?

Then Facebook started doing it to me after my child was born, sending me recommendations for the anti-vaccine groups. It was almost like social media was doing the recruitment for them, so I started writing about that too, saying that these two things are related. You have the social media companies that are really kind of normalizing this by pushing people into these communities. I hadn't ever typed anything indicating that I wanted to be in an anti-vaccine group, but that was where I was being pushed.

So I started writing about the intersection between vaccination rates, the social media conversation, the very, very, very large anti-vaccine activism, the specific groups that were forming on Facebook and the extent to which Facebook was proactively recommending them to people, the ways in which the anti-vaccine movement was running ad targeting, telling pregnant women and new parents that vaccines cause SIDS, which they do not, all of the kind of fear mongering and bad information that was out there.

And when the Disneyland measles outbreak happened, I started writing about it again, like a lot. And that was sort of my foray into combining data science and being in tech myself, being out in the valley myself with a lot of friends who worked at Facebook, like, just a deep understanding of what was happening under the hood and saying, not to take the groups down, just to be clear. I never argued that they should take the groups down. I think that that creates forbidden knowledge and a backfire effect. But I did advocate that they stop proactively constantly pushing them and recommending them to people. And I also thought that it was rather unethical to be accepting ad dollars to run campaigns featuring images of babies saying that vaccines cause SIDS. So that was what got me into this.

Christy Harrison: I can imagine that would be pretty unnerving to be pushed all that kind of content. Did you have any background with wellness culture? Any personal experience before that? Were you in sort of crunchy spaces or any naturopaths or elimination diets in your past or things like that?

Renée DiResta: No. Actually, I was in college before Instagram was a thing, or even, honestly, before Facebook was really a thing. I graduated in 2004, so I didn’t really have so much of that visual pressure at the time. I remember the Tumblr blogs. That was kind of where that sort of stuff lived at the time. And I remember some of the conversations about what do you do about that kind of culture? Not even wellness, but just some of the more extreme elements of it. How do you handle the ethics of people wanting to share their personal stories, but also the recognition of the implications that some of that can have?

Christy Harrison: Yeah, like the eating disorder content or that kind of thing.

Renée DiResta: Exactly. Yeah. The blue butterflies and stuff. And the ways in which people would try to use euphemisms to get past Tumblr and others saying, actually, this kind of content can be very harmful, so we are not going to permit it. I don't remember where Tumblr came down on that. I think they might have tried to say that this is a kind of content that we don't allow you to make Tumblrs for. But there was not so much of the promotion and the algorithmic recommendation back then. Recommender systems hadn't yet changed into what they later became.

For those who don't know, recommender systems work not only by looking at what you personally engage with. I personally was not looking for or typing in anything that it should have construed to be anti-vaccine. But the way that the recommender system works is that it looks at statistical similarities between you and other types of users that it sees as being like you. So I did make my own baby food, which maybe it thinks of as a crunchy thing. I actually thought of it as an economic thing, and a thing where I wanted my kid exposed to spices. But again, that sort of statistical behavior leads to it seeing a similarity, a statistical similarity, and deciding that this might be something for me.

This is called collaborative filtering. So it looks at statistical similarities between you and other users who are like you, and then it decides to show you content that they like, so there's a higher probability that you might like it. And then if you click on it, you have reinforced that for the algorithm, and that is a data point for the platform. This wasn't happening back in the olden days of Tumblr content. So it really got much more precise and much more potent, I would say, as other social platforms emerged and as machine learning got better.
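The dynamic DiResta describes can be seen in a minimal sketch of user-based collaborative filtering. The user names, interest tags, and engagement sets below are invented for illustration, and real platforms use far larger learned models, but the core logic is the same: a user who has never touched a topic can still be recommended it purely because similar users engaged with it.

```python
# Minimal user-based collaborative filtering sketch.
# Each user is represented as the set of topics they engaged with.
# Similarity is the Jaccard overlap between sets; candidate items are
# scored by summing the similarity of every other user who engaged
# with them, then ranked.

def jaccard(a, b):
    """Overlap between two engagement sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, engagements, top_k=1):
    """Rank items engaged with by users statistically similar to `target`."""
    target_items = engagements[target]
    scores = {}
    for user, items in engagements.items():
        if user == target:
            continue
        sim = jaccard(target_items, items)
        # Only consider items the target hasn't already engaged with.
        for item in items - target_items:
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Hypothetical engagement data.
engagements = {
    "new_parent_a": {"baby_food_recipes", "cloth_diapers", "anti_vax_group"},
    "new_parent_b": {"baby_food_recipes", "cloth_diapers"},
    "unrelated_user": {"stock_tips"},
}

# new_parent_b never sought out anti-vaccine content, but their overlap
# with new_parent_a makes the system surface it anyway.
print(recommend("new_parent_b", engagements))  # → ['anti_vax_group']
```

Clicking the recommended item would then add it to `new_parent_b`'s engagement set, strengthening the statistical association for the next user; that feedback loop is what made the recommendations DiResta describes so persistent.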

Christy Harrison: So let’s talk a little more about those dynamics of how things get spread and how especially like anti-vaccine content or online wellness misinformation more broadly, or misinformation in general. It’s true that the vast majority of people support vaccines, right? And support conventional medicine in general. And yet there are some very vocal forces online opposing those things and as you detail in your book, these small groups of people with fringe ideas can really leverage social media and algorithmic recommendations to make it seem like they’re more numerous than they really are. How does that work? How did the algorithms enable that? And then how does that in turn shape the discourse and even our understanding of what’s true and what most people believe?

Renée DiResta: So first, the early, early anti-vaccine groups like the National Vaccine Information Center established Facebook pages as far back as 2009. So they were very, very early to really using the platforms. And one reason for this is that as the evidence piled up in the medical literature that vaccines did not cause autism and that they did not cause the litany of things like SIDS or allergies or the myriad things that the anti-vaccine movement accuses them of, as the evidence began to be clear that that was not true, they were no longer invited onto morning shows to talk about it.

Prior to that, you had Jenny McCarthy on Good Morning America and these big, huge platforms talking about this stuff before the science was really quite, quite clear. As they lost access to that channel, when it became clear that this was just not true, they moved into social media because it gave them a way to continue to grow an audience, establish their mailing lists, reach people to, say, call your legislators about various bills that they were either introducing or fighting. So with that, they really prioritized building up a presence. And they also did a very good job of understanding that social media is for human stories and that the anti-vaccine movement really thrives on anecdotes.

In the context of the autism stories, this is the canonical trope of I took my perfectly healthy, normal child to the doctor and then the light went out of his eyes after he got his vaccines. And everybody has heard this story a million times if you’ve seen any of this content. But it is very hard in a social media environment where that story is being told to you, particularly as it moved into the era of first person video, to contrast that with a PDF fact sheet from the CDC saying, no, this isn’t true, right?

And so they really understood the medium. They understood this model of storytelling. They understood how to make it compelling. And they understood the basics of both marketing and engagement and what algorithms responded to, how algorithms worked and what they would push into people’s feeds. And so they got very, very adept at using it. Also, as Facebook in particular was trying to figure out how to grow its audiences and how to monetize its audiences, it began to offer sophisticated ad targeting tools to businesses. And it would do things like create ad targeting categories based on keywords that people typed into their profiles.

So if they had things like “warrior mom” as their job, or if they had anti-vaccine keywords in their profiles, which many of them did because this was an identity for them, right? This was a form of activism. This was a form of a thing that they really came to social media to spread the true faith, right? That’s what the word propaganda means, actually, to propagate the true faith. As they became more and more adept at using the platforms and growing the movement and using it as a tool to spread the message, the algorithms picked up on those keywords.

And so in 2015, after the Disneyland measles outbreak, when we wanted to grow a pro-vaccine parenting page on Facebook, particularly because we wanted to help pass a law to say enough is enough with the personal belief exemption, we wanted to ad-target pro-vaccine people to try to activate them, to get them calling their representatives. And lo and behold, we discovered that 100% of the ad targeting categories that Facebook would pull up when you typed in the word vaccine were anti-vaccine, because nobody saw being pro-vaccine as an identity. It's not your identity. You go, you get your kid vaccinated, and you go on with your day. You're not a passionate pro-vaccine activist unless that is your job, right?

And so it was a really fascinating view into the extent to which, once people had invested in some of these platforms, the algorithm reinforced and created this almost winner-take-all dynamic, where the people who invested early, who learned how to create sensational titles for posts and headlines for videos, and who knew how to tell stories really became the mechanism for reaching people, and everybody else was very much playing catch-up by 2015.

I’ll just say that after the fight over that personal belief exemption bill in California, which we did win, I remember talking to some of the folks from the CDC and them asking about how we had used social media to try to get the word out, which we were able to do, but not very well, not very efficiently, honestly. They said, ultimately in the end, it doesn’t matter. People will continue to vaccinate because they trust their doctors and those voices, those influencers, those are just some people online. And I was like, oh, boy, they really don’t get where this is going. And then, we have COVID a couple years later, and I just thought, okay, this is going to be a disaster.

Christy Harrison: Yeah. I mean, it really seems like even still we're playing catch-up, that the pro-science, pro-vaccine side isn't able to harness the tools of the algorithm as effectively as these anti-vax creators are. Where are we at today with social media recommending anti-vax content or other wellness misinformation, COVID misinformation, and things like that? Because there was a time, at sort of the height of the pandemic, when platforms were talking a big game and were shutting down and deplatforming major anti-vax creators, including RFK Jr. Now he's back and bigger than ever. So I'm curious how that's all played out and where we are today with these algorithms pushing that kind of stuff.

Renée DiResta: So during the pandemic, I helped run this project called the Virality Project, which became a political boogeyman for Jim Jordan and the House Republicans and I can talk about that. But what it set out to do was produce weekly briefings of the top most viral stories of the week for anybody who subscribed to the mailing list. Literally anybody could subscribe to the mailing list, but the people who did mostly were public health officials across the country in the federal government. We did brief Surgeon General Murthy’s team and some of the folks at HHS, some of the folks at CDC.

And what we were trying to do is just say, here are the most viral narratives and then the goal was to try to get them responding quickly with counter speech. How do you get them to tailor their messages to what people are actually concerned about? If people are actually concerned about magnetism in vaccines, that was, by the way, a real thing. I don’t know if you remember that. That one happened to go wildly viral, but there were others like snake venom in the vaccine. Or is the vaccine the mark of the beast? Which sounds really stupid to say, but Marjorie Taylor Greene was all over that and we would try to explain what online communities some of these rumors were moving into.

And we had originally started the project using the word misinformation. That was a word that was very applicable for measles vaccine false claims, right? But it became very clear very quickly that misinformation didn't work when it came to the COVID vaccine because nobody really knew what was true. There were so many things that were happening. The disease was evolving, the vaccine efficacy was waxing and waning. There were all these variants that were coming out. And really, it was a very new vaccine. mRNA was a new platform. Nobody really knew what was going to happen. So it really became very clear that we were dealing much more with a model that was much more like rumors, right? Information that people spread from person to person.

And sometimes rumors turn out to be true. So the most effective people who got our information were these groups of frontline physicians who happened to have their own Instagram accounts and their own Twitter accounts. And Twitter at the time had two sets of policies. It had its COVID policies. It also had its standard misinformation policies. And it had a practice it was doing where it was actually trying to give out blue check marks, right? Visibility and kind of credentialing a check mark to doctors who were working during the pandemic so that they would be seen as authoritative speakers.

So when they talked and they said, no, this is real. No, Ivermectin doesn’t work, the hope was that ordinary people would see them as credible, would see this person is not just another random, ordinary voice on social media, but this person actually has a credential and is working in a hospital, should theoretically know more about what they’re talking about. So Twitter was trying to elevate and uprank good information. That was the way in which they were doing this.

When platforms try to moderate bad information, they have three possible ways to do that. They can take it down entirely. That’s called remove. They can reduce it from distribution. They call that reduce. That just means that when it’s being recommended, it doesn’t get pushed into as many people’s feeds. And then they have a third thing where they can label it, and that is called inform. So remove, reduce, inform. That’s the three buckets of moderation. One of the things that we paid attention to in the project, and when we did work like this on election rumors also, was what distribution of types of interventions platforms used.

They really do use inform quite a bit. They wanted to be pointing people to good information, because when you take stuff down, you do create that backfire effect, that sort of backlash effect where you make the information forbidden knowledge, and then people go looking for it, and then you create this whole narrative about censorship. And that censorship narrative unfortunately became the response even to labeling and fact-checking some of the wellness influencers.

One of the things that we saw as we did this work was that the far right in particular really made being anti-vaccine, anti-COVID vaccines specifically part of a right wing political identity. And they really tried to reframe the vaccine as like the Biden vaccine, to reframe it as something that you did if you were a Democrat or if you were afraid of COVID but you didn’t get it if you were a Republican. And so a lot of those rumors, a lot of those false claims were picked up by prominent right wing political influencers who normally didn’t produce any content about health and wellness. So all of a sudden you saw it coming at people from many different sides, many different high follower influential voices. And it really had the effect of politicizing identity around that vaccine. And I think we’re really seeing that carry over into vaccine politics today.

Christy Harrison: Yeah, absolutely. You mentioned rumors, and I want to talk about that a little bit, because you argue that the contemporary system of influence that we're seeing now, where these small groups of fringe or maybe controversial or extreme voices weaponize the social media and algorithmic system, emerged from a collision between two distinct systems that came before: the rumor mill and the propaganda machine. Can you explain what those are and sort of how they've converged in this moment to create all this informational chaos that we're seeing now?

Renée DiResta: Yeah, so there's the sort of pre-written, pre-printing-press oral culture, the ways in which we transmitted information to each other long before people wrote things down. Then for a long while, print media, and broadcast media as it expanded into radio and television, was controlled in ways that ordinary people didn't have access to. So you could maybe write a letter to your editor, maybe you could protest if you wanted to. But ultimately people's engagement with media content was to be the audience and to discuss it after the fact.

But then there was this entire other system that people had for communicating information to each other. And you still see this in authoritarian societies where people kind of know that the media is BS, that the government controls it. And so rumors are information that are passed from person to person and they convey a story and there will often be challenges to an official narrative. Rumors might be gossip about a politician, right? And so things that maybe are not ever going to be printed in official media, that model of helping inform your community about the real truth of what’s going on.

And rumors are often very sticky, right, because they sometimes can't be proven or disproven; they sit in this space where the information about that story or that gossip might never be resolved. Even after a major media expose or journalistic takedown, people might feel that the story never fully resolves itself. And so the rumor keeps popping back up. And that's one of the dynamics around it.

So the rumor mill was just the ways in which people shared information amongst themselves. And it used to be very small and geographically bounded. And it's also something where you share rumors with people you trust, generally people who are members of your community. You used to do this in person, right? At the bar, maybe at the card club, whatever it was. The propaganda machine, meanwhile, was the sort of top-down, controlled, "this is the narrative" as it moves through the broadcast channel. And that doesn't mean that all media is propaganda, just to be clear. The book is nuanced, but it goes into how propaganda works.

For example, I think in that particular chapter I use Noam Chomsky's model of what the incentives are that lead media to cover certain stories in certain ways. Do they have relationships with sources? Do they not want to piss off their advertisers? Do they not want to take on the pharmaceutical industry because pharma buys ads on their channel? And there is a grain of truth to a lot of these things. It's sometimes not as much as Kennedy saying, oh, media is all biased and journals are biased and this and that and the other thing. But again, because there is a grain of truth, sometimes the critique is very resonant with people.

So I was trying to contrast the difference between this top down form of information control and then the way in which people pass information. And what happens on social media, interestingly, is you’re really seeing both kind of collide. You’re seeing very, very, very powerful elites who are on social media. You’re seeing algorithms that are controlled by billionaires. I wrote the book before Elon became a surrogate for Trump in the 2024 presidential election, right before he became a strident supporter of AfD in Germany, where he actively took steps to use his platform to boost particular political candidates.

And again, whatever you think of it, it is a thing that happened. When a person controls the algorithm, so to speak, they control the curated flow of information. And so that is something that I think people need to be aware of. Meanwhile, the rumor mill all of a sudden is no longer geographically bounded. It still happens as people pass information from person to person, but we can all see it, and we can all participate in it.

And the paperback version of the book will be coming out, and I’m writing an epilogue for it. And one of the stories I’m including in the epilogue is that moment during the campaign with eating the pets. It’s the canonical example of this handoff, right? You have somebody in a Facebook group who thinks that her Haitian neighbors have eaten her, I think it’s friend’s neighbor’s daughter’s cat, so several degrees removed. But you see this in rumors. This language is so common. The person almost never is talking about something that happened to them. It’s something they’ve heard third or fourth hand. It’s like the game of telephone.

That story is just weird enough that people start to share it, and they're like, oh, it's not just the cats, they're also eating the ducks, right? And then the Internet gets in on it, and they start making memes. You can use AI now to generate propaganda; it's quite easy. And they start making the pictures. And what happens? JD Vance picks it up, and now all of a sudden he starts turning it into a big story about immigration. Where are our borders? Where is Kamala Harris? That's a huge, deep story: Who is America for? Why are "these people" in our community? That's where he takes that rumor, and so you see it then begin to move through the political framing that it takes on. It becomes a propaganda moment for JD Vance and for the Trump campaign as they begin to use this weird online rumor to tell a very, very big story.

And ironically, there's nothing true about it. They find the cat, but by then, the damage is done. And when the New York Times, I think, eventually kind of tries to pin down Vance on what are you doing here, he says, if I have to tell stories, then I'm going to tell stories. And it's a very revealing moment, because it illustrates the extent to which this system can take a probably-not-true online story and turn it into an incredible political moment. And you see this happen even in the wellness space too; you're seeing this happen fairly regularly.

Christy Harrison: Yeah. And it makes me think of the title of your book, Invisible Rulers, which is a reference to, I forget who coined that term, but about propaganda in the, was it 60s or 50s?

Renée DiResta: It was the 30s. It was Edward Bernays in the 1930s. Yeah. This is pre-World War II that he’s writing about this. What he says is there are invisible rulers who control the destinies of millions. I wound up in a conversation about this with somebody on Threads this morning where they were talking about a sort of a far right figure, Nick Fuentes. And somebody had observed, Nick Fuentes is really funny. He’s getting all these views because he’s very funny. And that is absolutely true. He is also an extremist. He has a lot of these very, very extreme viewpoints, but it’s the way that he communicates. And this other person was saying no, no, no, he’s still a niche figure. He has only this many followers. That makes him so much smaller than all of these celebrities with their millions and billions of followers.

And it was very interesting to me to see that response where people are still thinking that mass follower appeal is the arbiter of what shapes public opinion, instead of this recognition that you can have people who are extraordinarily popular within a niche and then the way in which that niche is highly, highly active and goes out and becomes evangelists for the message and that the words and the language that they use are then almost incorporated into cultural social conversations on social media. I think about this a lot because that was what happened with the vaccine conversation as well.

There are people who are just hearing about this Tylenol thing for the first time because RFK Jr just said it on stage. But the rumor that Tylenol causes autism is one that scientists have been responding to and commenting on and trying to explain the nuance of for over 10 years now. I remember it from when I was pregnant back in 2013; this has been a rumor in the wellness sphere and in the anti-vaccine sphere, and it has come up in pregnancy forums for a very long time. But now some of the language around that is in the national conversation.

So sometimes it’s just a matter of language taking a while to get into the culture over time. And I think that people underestimate the extent to which passionate people in a niche actually can and do drive political movements and move things into the mainstream.

Christy Harrison: These are the invisible rulers now, right? The niche groups that are making these points, whoever started the rumor about the Haitians eating pets, that trickled up to J.D. Vance. That is a form of influence that’s somewhat visible, I guess, but it’s less visible than the rulers themselves or than the people around them whose names we know. This can all trickle up from some random person posting about whatever.

Renée DiResta: One hundred percent. And I think the thing that Bernays says in the book and the thing that I think really, really matters is the extent to which, even in the 1930s, he’s writing about the importance of appealing to people’s identity and the importance of appealing to people’s identity as a member of a group, and the extent to which we look to other members of the group for validation and to kind of check, am I right here? Am I being good here? Am I moral here? And the extent to which that need and that desire to be part of a collective, to feel like we belong, is something that can be tapped into.

Funny enough, that is what his book is about. The book Propaganda, which is the book that he wrote in 1928, is a manual. It is a manual for how to do this. He is writing it for practitioners of what he comes to call public relations. He is sometimes called the grandfather or godfather of public relations because he is writing this book to try to explain to companies how to tap into this, because he had just done it as a propagandist for the United States government in World War I.

Christy Harrison: Oh, it's so fascinating. There's so much more in this book that we could dive into, but I want to make sure we get to the big question, which is what can we do about all of this harmful influence on our discourse? I think a lot of people have an intuitive sense that it's just crazy out there and there's so much bad information flying around and it's all kind of breaking down. And society is at this place that we never thought we would see, and what can we do about it? Is this something that's not really in our hands as individuals?

Should we pressure Congress to regulate tech companies, for example? For a long time I thought that was the best approach, although now I'm starting to question that, given who's currently in power and how easily that regulation could be weaponized. We're seeing that already in how people in power are trying to lean on tech companies to not "suppress" conservative speech or whatever. But is that the solution? And there are definitely nuances to how regulation is done. And are there solutions more at the individual level too? Like, what does that look like, to sort of protect ourselves as individuals?

Renée DiResta: It’s a big question. I feel very strongly that

This post is for paid subscribers