In late December 2014, several visitors to Disneyland fell ill with measles, a disease that supposedly had been eliminated in the United States more than a decade earlier. Over the next month, the outbreak spread to more than 120 people in California, including a dozen infants; nearly half of the infected weren’t vaccinated. The outbreak was a predictable outcome of the state’s having allowed parents to opt out of having their school-age children vaccinated because of “personal belief” unconnected to medical or religious reasons.
Renée DiResta was then a mom looking for preschool programs in San Francisco. Her discovery that some schools had vaccination rates for routine childhood shots that were lower than in some of the planet’s least developed countries, combined with the shock of the Disneyland outbreak, led her to become active in the movement to eliminate the personal-belief exemption. Her background in finance and venture capital gave little hint that she would soon be mapping how anti-vaccine misinformation was spreading across social networks. Her attempt to counter the anti-vaccine movement gave her what she called “a first-hand experience of how a new system of persuasion — influencers, algorithms, and crowds — was radically transforming what we paid attention to, whom we trusted, and how we engaged with each other.”
In this podcast discussion, DiResta relates how the viral qualities of social media have transformed right-wing influencers into what she calls, in the title of her new book, Invisible Rulers: The People Who Turn Lies Into Reality. She discusses how her experience with the online anti-vaccine movement led her to become active in projects assessing how foreign adversaries were influencing Americans via social media and the internet, and eventually drew her into other controversies, including COVID-19 vaccine conspiracies and Trump supporters’ 2020 election denialism. In the process, her adversaries created a firestorm of false allegations against her, charging that she was a CIA operative running a global scheme to censor the internet — allegations that were eagerly received and acted upon by bad-faith members of Congress. DiResta’s story illustrates the malign nature and vast scale of emerging online threats to the democratic process, and also offers some suggestions for how governments, institutions, and civically engaged citizens can combat those threats.
Transcript
Renée DiResta: This is not a technological problem. This is a human problem. And this is an infrastructure for communication and for attention. And when you have that, people who are powerful, who want more attention and to capture an audience, reach an audience, are going to continue to work the refs. This is basically a gigantic ref-working campaign.
Geoff Kabaservice: Hello! I’m Geoff Kabaservice for the Niskanen Center. Welcome to the Vital Center Podcast, where we try to sort through the problems of the muddled, moderate majority of Americans, drawing upon history, biography, and current events. And I’m delighted that we’re joined today by Renée DiResta. She is a writer and former research manager at the Stanford Internet Observatory, who has served as advisor to the U.S. Congress on ongoing efforts to prevent online and social media disinformation. And she’s the author of the terrific new book, Invisible Rulers: The People Who Turn Lies Into Reality, which has just been published by PublicAffairs. Welcome, Renée!
Renée DiResta: Thanks for having me.
Geoff Kabaservice: And congratulations again on Invisible Rulers. I personally found it too upsetting to be called an enjoyable read, exactly. But it is a thoroughly absorbing tale of how the internet and social media, which only a few years ago promised a sort of near paradise of digital interconnection, were turned into the hellscape that they are now. And although you do hold out some hope of improvement, one of the main thoughts that occurred to me while reading your book is that Dante’s Inferno is a set of descending circles, and I very much doubt that the internet has come to rest at the bottom of that funnel right now. Invisible Rulers is also not just a history and survey of this blighted terrain, it’s also a personal history of how you came to be involved in the plot.
Not to be coy, but many of the trolls and imps and fiends who have brought us this infernal realm have seized upon you as their personal devil figure. Jonathan Rauch’s recent review of your book is titled “The Censorship-Industrial Complex: The Right’s New Boogeywoman” — that would be you — and your most recent piece in the Atlantic, “My Encounter with the Fantasy-Industrial Complex,” is subtitled “Online conspiracists turned me into ‘CIA Renée.’” It does seem that what led you to become “CIA Renée” as well as an expert on and victim of internet virality was the literal viral outbreak of measles with which you begin your book at Disneyland in Southern California in December 2014. Can you tell us how that incident started you on this journey?
Renée DiResta: Yes. I had my first child in December 2013, so he was right around one year old when the Disneyland measles outbreak happened. And I had just begun doing the work that you have to do to get your kid on preschool waiting lists in California. Everybody’s got to do this: put your name on a list a year in advance and hope for a spot. I decided I wanted to send my child to a daycare or preschool with a decent vaccination rate. And literally it turned into pulling down California Department of Public Health data and realizing that there were — how did the media put it? — some with rates lower than South Sudan’s in measles uptake, and rates of 30, 34% for the MMR vaccine. I found it actually outrageous, and so I was looking for one that wasn’t like that.
Then the Disney measles outbreak happened, and I actually called my local state senator — Mark Leno at the time — and I said, “Hey, is there nothing that we can do about this? This seems really ridiculous.” And he said, “You should talk to Richard Pan,” who’s a state senator from Sacramento. He’s also a pediatrician. He was going to be introducing a law to eliminate what were called the “personal belief exemptions,” the opt-outs that are just for the sake of opting out. I had come from New York where you had to either have a religious exemption or a medical exemption, and those were the two options. California had this third option. And so Senator Pan was introducing this bill to try to reduce that, to eliminate the “personal belief” exemption and to have the medical exemption be the thing that decided whether kids were going to get vaccinated.
And it was really just me wanting to get involved in that campaign. I was like, “What am I good at? What am I useful for? Well, I do understand social networks.” So I started doing network mapping to try to understand the anti-vaccine movement and the people who were going to be our opposition — and were in fact our opposition during that political campaign, during that fight to pass that bill. And it was really wild. We did what every new activist does, which is we made a Facebook page — as you do — and then we had to get people to follow the Facebook page. And this is when we realized we could do paid targeted ads, and this was when I realized that when you opened up the ad tool there were things where, if you typed in the word “vaccine,” you could target ads to people with anti-vaccine interests; that was built into the system as a formal option. But there was no such thing for pro-vaccine. There was no pro-vaccine ad targeting setting. There were warrior moms on one side and nobody on the other.
So then we decided, okay, well, we have to grow a movement. And then we realized that we could run zip code-level targeting to… I started doing searches for “-ologists,” trying to find every person who had some vague scientist or medical affiliation. That was what I was going to target ads to. And it was really just this wild campaign of realizing that we didn’t have to disclose who we were. We could micro-target people with very, very high-precision categories. And we could run automated accounts to push out content into the Twitter hashtag that people were looking at. We realized we had to do that because they already were.
So this turned into, “Well, let’s learn about bots. All right, who’s going to do it? Me? Okay, cool.” And it was just being sort of thrown into the realm, and — this was in 2015 — feeling like I was seeing the future, right? This is what every single campaign was going to look like going forward, and if I could do it, anybody could do it — and probably everybody was. And that was what really got me very interested in how do you shape public opinion in the modern era.
And then the other thing that I was doing was using network mapping, network analysis to build maps of influential nodes of anti-vaccine networks, and realizing that they really just owned the board and everybody in public health was not even in the conversation. Pro-vaccine voices were not even in the conversation. So we were feeling like we really had to rectify that, and that was what got me very interested in just how public opinion is shaped today.
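DiResta doesn’t name her tools here, but the kind of influence mapping she describes can be sketched in a few lines. Below is a minimal, hypothetical example in Python using the networkx library: the account names and edges are invented, and in-degree centrality is just one plausible influence measure, not necessarily the one her team actually used.

```python
# Minimal sketch of mapping influential nodes in a sharing network.
# Hypothetical account names and edges; in-degree centrality is one possible
# influence measure (the source does not specify the method used).
import networkx as nx

# Each edge points from the account doing the amplifying to the account
# being amplified (e.g., a retweet or share).
edges = [
    ("parent_blog_a", "warrior_mom_1"),
    ("parent_blog_b", "warrior_mom_1"),
    ("local_group_c", "warrior_mom_1"),
    ("parent_blog_a", "wellness_site"),
    ("warrior_mom_1", "wellness_site"),
]

g = nx.DiGraph(edges)

# Accounts that receive the most amplification score highest.
centrality = nx.in_degree_centrality(g)
for account, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{account}: {score:.2f}")
```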
Geoff Kabaservice: That’s an intriguing superhero origin story. In the interest of full transparency, I should point out that this, for the first time, is a podcast being recorded in front of the proverbial live studio audience. And I’m also joined here as co-interviewer by Berin Szoka. He is the founder and president of TechFreedom, which is a Washington-based nonprofit, nonpartisan think tank that, in the words of its website, “digs deep into the hard policy and legal questions raised by technological change.” I should also add that Berin made Washingtonian magazine’s coveted list of “Washington DC’s 500 Most Influential People of 2024” — congrats on that! But more to the point, Berin is a real expert in this field, who has been deeply immersed in all the issues that we’ll be discussing today for many years now. And in fact, you have lots of points of commonality, Berin and Renée.
Before we head over to our conference on “Liberalism for the 21st Century,” which is being put on here in DC by the Institute for the Study of Modern Authoritarianism — this will be my fourth conference this week, by the way; July is not high season for the Washington think tank and policy community — Berin, you’ve come across a lot of books about the internet and how we came to be where we are. What stood out to you about Renée’s account?
Berin Szoka: I’m a big fan of Renée’s work — I have been for a long time — so I wasn’t surprised that she hit the ball out of the park in telling a story that could be understood by normal people. I’ve struggled to communicate in my own writing what’s going on, what the internet is doing, and in particular why it has become the political hot potato that it has been. I think she tells a story that anyone can understand. She distills media theory and summarizes the way that media work today in very relatable human stories, and then brings those stories to life in real events that happened, that I have lived through since 2018.
What she has described, starting with the anti-vaccine movement, of course has become front and center in the culture wars over the political implications of online speech. And people who like to spread complete nonsense and lies and rumors and so on are very angry about having their content moderated or not getting the reach that they feel they deserve. And they have turned that into a wedge political issue and attempted to use the government in a variety of ways to essentially force carriage of that lawful but awful content.
Geoff Kabaservice: Renée, I always talk to people on this podcast and ask them something about their personal origins: where you grew up, where you went to school, how you came on the journey that led you to the subject of your writing. So can you tell me something about your own background?
Renée DiResta: Yes. I grew up in Yonkers in New York. I interned for the CIA when I was an undergrad.
Geoff Kabaservice: How long of an internship was that?
Renée DiResta: Five years. I was an undergrad for five years. I did two degrees and two minors, which in retrospect was stupid, but at the time seemed to make sense.
Geoff Kabaservice: And how long was your internship?
Renée DiResta: It was those summers, so that would’ve been ‘99 to 2004.
Geoff Kabaservice: Okay. And where were you in undergrad?
Renée DiResta: Stony Brook. I went to SUNY.
Geoff Kabaservice: I hasten to point out here, from my own parochial standpoint, that the recent president of Stony Brook University is now the incoming president of Yale University, my alma mater. So we have that in common. And what was your major?
Renée DiResta: Computer science and political science. And I minored in Russian.
Geoff Kabaservice: As one does. So what did you do after graduation?
Renée DiResta: I decided I did not want to stay at the Agency and I went to Jane Street Capital. I actually thought I was going to go to law school. It’s sort of funny being in a studio audience with a bunch of lawyers. I took the LSATs, thought I would apply, but got this job at Jane Street as a junior trader… Well, actually it wasn’t even junior trader; I was doing clerk work. I was writing code to basically scrape from Bloomberg terminals — this was in the pre-API days. And I just really got hooked. I loved the dynamic of high-frequency trading and derivatives, arbitrage. I preferred math to computer science. I did a bunch of math coursework in my CS degree. I really liked applied math, really liked the more numerical aspects of it, and didn’t like writing code. And all of a sudden I found this world of trading and wound up spending eight years there.
Geoff Kabaservice: And how did that background in finance and venture capital prepare you for the move into analyzing social media and the internet?
Renée DiResta: I really love quantitative work, and what we did at SIO was very important to me, to have that pipeline from quantitative research through to policy — not just taking and synthesizing other people’s work, which is great. But I wanted to have that from start to finish: We’re going to build ingest pipelines, we’re going to analyze the data ourselves, we’re going to formulate opinions based on quantitative research, and then we’re going to take that through to public communication, to policy — and then academic papers were the last piece of it. Since I didn’t come up through academia, the academic publication process wasn’t really the main incentive for me. Incomplete information, high-stakes situations, crazy interesting rapidly-evolving scenarios — I always loved that aspect of it. And I think that was what was, I guess, common to the work I did at Jane Street: that same kind of thing.
Geoff Kabaservice: That’s terrific. In terms of getting into the internet, I assume you had been a casual user of social media like everybody else around that time.
Renée DiResta: Oh, I was like an early adopter. I was sneaking onto BBSes and AOL and things when I was in middle school. My dad taught me how to code when I was really little. I’ve been on every network as it has emerged, with the exception of MySpace. I never joined MySpace.
Geoff Kabaservice: And just leaping ahead a little bit here, you used the acronym SIO, which is the Stanford Internet Observatory. This is an organization that is perhaps no longer going to be with us, is that right?
Renée DiResta: I’m not totally sure what they’re going to do with it. It had been a place to study adversarial abuse online, abuse of current information technologies. And so that took a couple of different forms. Sometimes that was trust and safety work, looking at individual… the experience that you might have on the internet that is bad — meaning pro-anorexia content, child safety issues, exploitation content, non-consensual nudes, these sorts of things. All the different facets of trust and safety — spam and scams fall under there. But it also studied things like information integrity; state-actor information operations and things like that were part of the work. Generative AI and emerging technology was part of the work. I think a big chunk of that is no longer going to be done, but they’re going to refocus it around child safety to be more aligned with the research interests of its new director.
Geoff Kabaservice: We will return to the SIO, but let’s go back to the project Vaccinate California with which you were involved. In a curious way, it seems that the events around the pro- and anti-vaccination online battles in California in the middle of the last decade really prefigured a lot of the battles that would come, especially around the COVID-19 pandemic. Can you tell me what you first encountered in terms of how the anti-vaccination forces operated online and what made them successful?
Renée DiResta: In 2015, I think there were two things. One, they told human stories; this is I think the thing that was first and foremost. It was primarily parents who’d had a bad experience of some sort, so they had a story about either their child or somebody close to them. And that’s a very… When you position a story against a PDF with a bunch of statistics, these do not have the same emotional resonance. A story about how this parent believes that their non-verbal autistic child was injured by a vaccine… You can point to all the evidence in the world and say the evidence clearly shows that vaccines do not cause autism, and yet pointing to studies and citations is not the same thing as hearing that story from the person who is telling it. And one of the things that social media really did was, as I noted, that you could ad-target, it turned out, and so you could reach people with human stories.
And I remember seeing ads when I had first had… It was maybe right when I had my second baby, where it was ad-targeting around how vaccines cause SIDS. It was pictures of babies who had died of SIDS, sudden infant death syndrome. And this is the thing that every parent who has put a baby to bed between one and four months old, or zero and four months old — you have that in the back of your mind that the worst case is always possible. And so when you’re being deluged with paid ads targeting you, telling you vaccines cause SIDS, it is not a thing that you easily forget. It’s a very persuasive type of communication. And the fact that you can ad-target people who are new parents or pregnant, for example, makes it incredibly impactful.
The CDC is not out there doing counter-campaigns with compelling content trying to put these theories to rest. Instead, you have maybe an ad-hoc band — which is kind of what Vaccinate California was — of volunteers, physicians who decide to pick up communication as part of their social outreach, trying to counter these narratives. But they don’t have a budget behind them. They’re not running ads, they’re not paying to target people, they’re not creating groups. So it was very much a sense that you were bringing facts to a knife fight, so to speak. You were not really in the same game.
And then the other piece of it was the CDC foundationally did not understand modern communication — not from a storytelling standpoint, but I mean from a basic structural standpoint. I tell the story in the book of how Vaccinate California gets invited… Some of us were invited to present at a CDC conference, and we were down there, and we were telling people about this positive pro-vaccine campaign we’d run and how we had tried to build a movement and so on and so forth. There was no funding for it; we kind of went there hoping that there would be some funding. And what wound up happening instead, though, is some of the people that we spoke to were actually quite dismissive. They were like, “Oh, well, these are just some people online.”
Geoff Kabaservice: That particular phrase appears again and again in your book.
Renée DiResta: I’ve never forgotten it: “These are just some people online.” And I thought, “Oh my God, this is a disaster. This is going to be a disaster. They’re not going to get it, and we are going to continue to rely on this idea that because you have a credential or a white coat, or you’re in a position of authority, that people will continue to respect institutional communications. And that is not what is going to happen.” And you could see what was going to happen. As far as I was concerned, again, I felt like I had just seen that this was how every campaign was going to be fought, and these people were not even going to be in the battle. And that’s the future.
Geoff Kabaservice: The flat-footedness of institutions is a major theme of your book: how they’re completely unprepared to deal with this new viral challenge, and then how once they understand that they’re under threat, they don’t understand how to deal with it. And that’s largely because they’re operating in a very different tradition, a tradition where a lot of social trust is accorded to experts and institutions. And that’s no longer the world that we’re in.
But what was very interesting about your book was that on the one hand, you’re actually going back to some of the oldest patterns of humanity. You point out that in medieval villages there typically would be one person or several people who knew more of the gossip and whose opinion counted disproportionately. And therefore a medieval pollster going into the village would have gotten a very different picture from talking to the majority of the inhabitants versus talking to one of these particular influencers, if we can call them that.
But then again, your book is also picking up on a lot of themes that surfaced in Martin Gurri’s very influential book, The Revolt of the Public, which points out that a lot of the past patterns of deference to expertise and authority and institutions simply no longer hold in the leveling age of the internet.
Renée DiResta: Yep, absolutely. I am an institutionalist. I believe we need norms and values, and institutions are what underpin society. And when the institutions begin to weaken… Again, what I was trying to do in those conversations early on was to say, “Here’s what’s coming. You can change. You might want to pay attention to this. This is important.” But what I try to point out in the book is there are two things happening here. One is the institutions not adapting, not understanding, and not recognizing what the internet can do. There’s actually this fascinating article… We’re all out here in DC, and there’s the NATO conference that’s going on…
Geoff Kabaservice: We’re going to be watching motorcades go by, shimmering in the hundred-degree heat…
Renée DiResta: But what was really interesting was there was this article in the Washington Post about the NATO conference inviting a bunch of influencers to report on it. And so that’s the kind of thing where I see that and I feel briefly encouraged that maybe some people are starting to get why they have to be there. But then it was very much this idea that they didn’t really adapt to understanding what was influential and who was influential and how to shift their communication strategies in response to that.
The other thing that institutions are dealing with though, on the flip side, is that they are under attack quite deliberately because it is very effective for people who are establishing themselves to create a devil, to create a target. We have seen this with the New Media-versus-media dynamic: “The media lies to you, the institutions lie to you.” And so there is also this concerted effort to attack institutions, oftentimes quite unfairly. And so you have both the real problems that they do have and then the manufactured problems that continue to galvanize the distrust. So they’re not particularly effective right now on either of those fronts.
Geoff Kabaservice: I’d like to quote from the book here because I think this is an important descriptor. Renée, you wrote: “This is not a book about social media. There are enough of those. Rather, my focus is on a profound transformation in the dynamics of power and influence, which have fundamentally shifted, and on how we, the citizens, can come to grips with a force that is altering our politics, our society, and our very relationship to reality.” It’s interesting because a lot of the people who have gone after you are, let’s face it, on the political right, and yet it seems that the Republican Party’s journey to this place has been rather gradual and not entirely intentional. Berin, I wonder if you could talk about how you and Renée both had some early interactions with Republican hearings on Facebook, for example, and how this fit in with the trajectory that she’s describing.
Berin Szoka: Yeah, I would say it wasn’t so much gradual as it happened very suddenly and then became an obsession on the right. And in 2016 it was a relatively minor obsession, and then it exploded and became what is now at the center of the MAGA crusade; it’s one of six pillars of Trump’s campaign. I’ll just reset to where I started interacting with your story and then where I think we ultimately met. This was in 2016 — you tell the story extremely well — that in the primary season in March of 2016… Well, I’ll let you tell the story. But Republicans were very upset that they felt that they weren’t getting their fair share of attention in the new “Trending Topics” feature on Facebook. So why don’t you tell us that story?
Renée DiResta: These were the “fake news” fights of 2016. Now “fake news” has a whole other meaning, but it originally meant bullshit news, news that was demonstrably untrue. And there were these headlines that would… Facebook, for those who maybe don’t remember eight years back, had a “Trending Topics” feature. It was on the right side of your page, and you could click in and see by category what the various trends were. And then there was the “top trends,” and you would see some of these stories trend that were on blogs that looked like right-wing blogs; they were clearly catering to right-wing audiences. And it was stories like “Pope Endorses Donald Trump,” “Megyn Kelly Fired by Fox News,” or a number of these stories. I remember those two headlines because what they actually turned out to be was Macedonian teenagers running content farms that were just incredibly effective at making things trend and catering to perhaps a very credulous audience base.
But also, as we were talking about earlier, you could target that content to people. And then if enough of them engaged with it or shared it, it would hit other people’s feeds, their friends’ feeds, and then their friends’ feeds — and then you’d get this trend. And so Facebook had to decide what to do about this. Just keep in mind, these are spammers — these are literal spammers. This is not actually high-quality content that we’re talking about here. They were literally out there to profit. There’s no ambiguity about a lot of this stuff.
But what it turns into… So Facebook, somebody… I’m trying to remember which outlet covered this, whether it was the Daily Beast or who broke the story. But Facebook had hired human curators to try to keep the wildest stuff from trending — given what was trending, I can only imagine what was not. But they hired these human curators. And this media outlet reported that several people who had been on the team had said that they had been filtering out conservative news stories. And this just exploded.
This became a huge deal because then Glenn Beck went and visited Mark Zuckerberg. There was a whole group of conservative media leaders who went to Facebook to meet with Mark to try to get to the truth of whether or not conservative content was being censored. And they had a conversation. I think they actually came away with a fair amount of goodwill, if I’m not mistaken. They came away and I think Glenn Beck said, “No, this was all a misunderstanding, et cetera, et cetera.” But Facebook was really spooked by this, and they fired or let go the entire human curation team. And then trending was purely algorithmic.
And I remember it at the time because I was fascinated by trending algorithms on Twitter, on Facebook. This was my hobby at the time: what is trending and why and how. I remember I had a friend who was on Facebook Data Science. I had a startup at the time — I was in the Valley, also in tech — and I would send him these screenshots from my own trending feed. And I was like, “I’m in this science section, and there’s a witch blog talking about Mercury in retrograde. And maybe because it says ‘Mercury’ it’s in your science trends, but oh my God, what did you guys do over there? This thing is useless.” And that’s when you realized the human curators were in there maybe trying to keep the wheels on the bus. And Facebook eventually just nuked the entire feature. They couldn’t control it — not “control” from a we-are-in-there-to-suppress-this viewpoint, but more from the standpoint that the sensational, spammy stuff was just what was going viral all over the platform. And so the trending feed was essentially useless.
Berin Szoka: Well, it didn’t happen for no reason, and it wasn’t just that they got spooked. Republicans on the Hill threw a fit about this. They responded to anger from the grassroots and from very visible, very online people. You’ve mentioned the outrage about this. And in response, John Thune — who I think is certainly very far from being the most populist of Senate Republicans — sent what actually in retrospect is a fairly tamely worded letter just inquiring to know more about how the human moderation of the Trending Topics feature worked, what the training was, how they assured political neutrality. And the company, because of the pressure that they received from the Hill, and the signal that Republicans were mad about this, and the narrative among Republicans which was that there was no way to ensure political neutrality if human beings were involved because the human beings would inevitably be liberal San Francisco elites — that’s why Facebook killed that feature.
And that’s why they turned it into the free-for-all that you’ve just mentioned, which opened the door not only to the witches’ stories in the science section but weaponization of that by Republicans in the 2016 campaign. It was a very close election. It would not have taken very much manipulation of those tools to swing voters. I am not a quantitative expert, so I’m not going to say that that’s what steered the election, but it was one factor. And more importantly — you tell this story very well, Renée — it demonstrated that you could use outrage over so-called censorship to get changes made by the large tech companies in how they moderated or curated content. And that is the playbook that we’ve seen used over and over again, and that really was where this started.
That’s how I got interested in this topic and started looking for people like you who were studying this quantitatively, because I didn’t really understand what was happening. All I saw was the political side of this, and that this was the first really successful example of what we call “jawboning,” which is of course an allusion to a biblical story in which one of the heroes of Israel manages to slay a thousand of the enemy with the jawbone of an ass; in other words, using a very, very small thing as a weapon to do great damage. And that’s exactly what that letter that Thune sent — which purported merely to be asking questions — did. We’ve now seen that be done again and again and again by people who are now full-time populist demagogues, who have devoted themselves to that kind of weaponization of the federal government even as they lead efforts purportedly against weaponization of the federal government. It is of course projection: they are the ones doing the weaponizing.
Geoff Kabaservice: So it’s actually hard to remember this, but a decade ago, in the years before the 2016 election, there was almost a golden age of bipartisanship, and both Republicans and Democrats were worried about the potential for foreign adversaries to be influencing us via social media and the internet. You pointed out that in 2014, there seems to have been the first introduction of Yevgeny Prigozhin’s Internet Research Agency troll farm into some of our social media, with a hoax about a Columbian Chemicals leak. And then in the following year, you were invited to participate in a US Digital Service project working with the State Department to assess the social media presence of ISIS, the terrorist organization, and the government’s response. Can you tell me something about that?
Renée DiResta: That was really fascinating. I was sort of floored to get the invitation, first of all. U.S. Digital Service was this effort to — it’s ongoing, but an effort to bring people who work in tech or people with some sort of tech skills to do what they call “sprint teams,” basically to go into a federal government agency and to do some work that might be a short-term thing like fixing some websites, or providing some sort of tech guidance, or it might be something more conceptual. I was asked to help advise on understanding ISIS’s presence on social media. And I said, when they asked, “I don’t know anything about ISIS.” And they said, “No, no, no, we have other people who know about ISIS. But what we don’t really understand is the social network.”
I had been writing at this point… I felt like I was Chicken Little: Maybe the sky was falling, maybe it wasn’t, but I should tell people about how the anti-vaccine fight had actually gone. I wanted people to understand: Here’s how you look at these networks. Here is what they do, here is how they work, here is the process, here is how you grow a fandom. And they said, “We want you to come and put some of the dots together, connect the dots on how ISIS is doing it.” Because by this point, ISIS had essentially grown a virtual caliphate. There was a team at Brookings that did what was called the ISIS Twitter Census, basically looking at their presence on Twitter. There was a huge debate. And if you search for this, it’s really interesting to go back in time and read the commentary about it, because Facebook reacted pretty quickly and they didn’t want them on the platform; they took them down. Twitter largely left them up. And this was this big debate about one man’s terrorist is another man’s freedom fighter, and is this a slippery slope? If we take down ISIS, what does that say about free speech and about our commitment to diversity of viewpoints?
However, as ISIS continued to grow, you had these atrocity moments like the Bataclan massacre. ISIS was extraordinarily adept at producing “propaganda of the deed,” these sort of sensational, terrible images and attacks on hostages, all the various things that they did. And that was one side of ISIS’s content. The other side of it was this very human approach. Here are the sort of influencers of the movement. We called them the Jihadi brides, the women who had actually gone over to marry ISIS fighters, and then began essentially Instagramming their life: connecting with people, moving them to Kik, engaging in group chats, telling them… Or they would find each other on Twitter and move to this messaging app called Kik, and that was how they would tell them, “Oh, here’s how you buy a ticket to Turkey, then you’re going to cross the border to Syria, et cetera, et cetera.” And so it became a recruiting pipeline.
It was interesting to me… I was there largely to talk about networks and social media, but there were other people who were… There was one woman in particular who was an expert in building brands. And it was the first time I’d ever heard somebody do a brand breakdown of what is compelling about this content and why. And I felt like, “Wow, here we go. Here’s somebody breaking down how the storytelling component actually works: why this stuff is resonant, why this type of charisma is so appealing to disaffected kids who then go and actually join the movement or go and commit atrocities or attacks.” And I felt like it kind of connected the dots for me. It was one of those formative experiences. I feel like I can intuit and bring my quantitative approach to studying it, and then here’s this other person who’s like, “Let me tell you how this appeal actually functions and why these slogans work.”
The problem though, at the end of the day, was what was the U.S. government supposed to do about it? And that I think remained a problem to a large extent because what they were doing about it was they were tweeting back at the terrorists with attributed State Department handles and things like this. But what it misses again is: Who is the messenger? They had this hashtag “Think Again Turn Away.” And the idea that a U.S. government diplomat in DC — or not even a diplomat, just a staffer in D.C. — tweeting at you about how you shouldn’t go join ISIS, as if that’s going to counter this extraordinary video game-style propaganda recruitment material that they’re constantly pushing out, or the personal engagement they’re having with you on Kik… It was one of these things that showed the limitations of government counter-speech, the lack of ability to galvanize an immediate movement to do the countering.
There were some really interesting counter-trolling movements that began to come out, particularly out of Japan; I think they called it “ISIS-chan” or something. I’m getting in the internet culture weeds here, but it’s like what the NAFO troll guys do with the Russians today, with the little Shiba Inu. Can you use trolling effectively to be that counter-movement? The squares in the government telling you not to do a thing are not going to compel you. But is there a way to create and galvanize a movement and a pool of content producers? You did start to see some of these grassroots efforts to do that, but ultimately it was very similar. I think that problem continued.
Geoff Kabaservice: We could talk all day about how the government could have responded better to the outbreak of COVID misinformation, but can you just tell me briefly how a more adept government, a more adept CDC or WHO, might have responded to these online rumors spreading?
Renée DiResta: Well, in order to respond to a rumor, you have to be quick, and this is one of the real challenges. Rumors have always existed, obviously. The definition of a rumor is basically unverified information passed from person to person. People share them because it’s a pro-social behavior: you’re part of a group, you’re sharing something that you think is interesting that your community should know. And this was happening constantly as the COVID vaccines came out. One of the problems with a lot of the thinking about responses is that there’s an assumption that it is misinformation, meaning something that is factually wrong, when what was happening most often was that it was stuff that was unknowable. And in order to respond to something that is unknowable, you have to participate fairly quickly. And in addition to participating fairly quickly, you have to be seen as trustworthy.
Trustworthiness comes from transparency and also frequent communication. You’re not going to have… Again, I think there’s some legacy trust in the institution. But as the institution is reticent, doesn’t communicate, or communicates three days later and appears to be leading from behind, that doesn’t continue to inspire trust. So rather than being out there with things like, “Hey, this is what we know in this moment about origins, vaccine safety” — you name it, there were so many different moments: masking, is it airborne, what do you do on an airplane anyway? — and all these different questions, they still moved through a much more traditional style of communication.
And so you started to see frontline physicians do that organic counter-response. The problem that they have is that if you’re counter-speaking and you’re both sitting next to each other at a table, people might find the expert much more persuasive. But when you have a networked communication environment, the amplification piece is so critical to the message even being seen. The messages from random doctors who decided that they’re going to get on the platform and counter-speak that week, they have what, twelve followers? Compared to the influencer who’s been a wellness influencer for the last five years, a political influencer for the last two years — they have millions of followers. And so the difference, again, it’s the investment in the network over time and the prioritization of building the network that is so foundational. When we talk about counter-speech, you have to understand that there’s really a structural difference. It’s not two opinions being presented on a news show in that moment. It’s a very different process.
Berin Szoka: Renée, you say here that “Americans really want to believe in the marketplace of ideas, but social media today is really more like a gladiatorial arena.” What did you mean by that?
Renée DiResta: Two things. First, the platforms incentivize combativeness. Twitter in particular incentivizes combativeness. That is a function of two things. One, there are two different levels of attention-grabbing potential on Twitter. Is your tweet curated into other people’s feeds? Morally strident language tends to be pushed out to more people. Influencers are really good at this. They’ll literally sit there kind of A-B testing the algorithm to see what kind of stuff gets pushed into feed. Sometimes adding an image is great, sometimes adding a link is better. It really depends on the platform. So first it’s just getting pushed into the feed.
Then it’s getting pickup in hopes that you’re going to get into trends, which is going to get it… The stuff that’s going to be pushed out in the feed is going to go primarily to your followers. Not on all platforms; on a platform like TikTok, it might go out to random people. But in order to break out of your group of followers, you need that networked pickup. So person A tweets something, ideologically aligned person B picks it up, ideologically aligned person C picks it up. Eventually the trending algorithm picks it up, and then once it shows up in trends, it will continue to snowball. So that dynamic is one piece of it.
The other thing, though, is that when you have that morally strident language over and over and over again, social science studies show that people who use out-group attack language tend to get more pickup. So it’s not only morally righteous indignation about some wrong that has happened to you, but it is also like “Those people over there did this to you. Those people did this to us.” And once you add that element of the out-group in, you’re creating an opportunity for your fans who are there to have that fight with you to go and attack those other people, and then those other people are going to respond. And that’s how you start to see these big ideological brawls that constantly happen on Twitter.
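A toy feed scorer makes the mechanism she describes visible. This is not any real platform’s ranking algorithm; the weights and the out-group multiplier below are assumptions standing in for the research finding that out-group attack language gets more pickup.

```python
# Toy engagement-optimized feed ranking (hypothetical weights; not any real
# platform's algorithm). The out-group multiplier stands in for the finding
# that out-group attack language tends to get more pickup.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    reshares: int
    replies: int
    attacks_outgroup: bool  # crude stand-in for a language classifier

def rank_score(post: Post) -> float:
    base = post.likes + 2.0 * post.reshares + 1.5 * post.replies
    return base * (1.5 if post.attacks_outgroup else 1.0)

feed = [
    Post("Sourced explainer on vaccine safety.", 40, 5, 3, False),
    Post("THOSE people are lying to you!", 40, 5, 3, True),
]

# Identical raw engagement, but the out-group attack ranks first.
for post in sorted(feed, key=rank_score, reverse=True):
    print(f"{rank_score(post):6.1f}  {post.text}")
```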
Berin Szoka: This I think is the key detail in your book. You say that those people tend to get more pickup, but that’s because of the particular architecture of how these services are designed.
Renée DiResta: Exactly.
Berin Szoka: Larry Lessig famously said that “code is law.” And what he meant was that the way you design a service really shapes how it functions. It is the governing environment. So what is it about how the services are coded today that creates this gladiatorial arena, and how do you think it could be improved?
Renée DiResta: It’s a problem of incentives. The platform wants people to be there, and so for a very long time the platforms optimized for measures of engagement, I would say through to about 2018. Eventually around 2018, they start to have this concept of what they call “healthy conversations,” where it’s not just engagement that they’re going for but they start trying to optimize for what they think of as healthy, meaning that there is not a significant amount of out-group demonization in what they’re surfacing. This is where they start to realize that hate speech and things like that might actually be a problem. But what they’re essentially trying to do is keep people onsite, and keeping people fighting is a way to keep them onsite.
Berin Szoka: It seems to me what they’re trying to do is not only keep people onsite but ultimately to sell ads, which is a function of keeping people onsite but also making sure that advertisers want the attention that their content is getting, that they’re comfortable with the content that their ads are being shown next to. It seems like there are conflicting incentives here. And of course, we’ve seen that this fight over content moderation and curation has taken on a really clear economic dimension. Just today, I think it was, Ben Shapiro was testifying in front of Congress claiming that there’s some sort of global advertising conspiracy against conservative content. He demonizes in particular a group called the Global Alliance for Responsible Media, which is a collaboration among global advertisers to set basic standards for what brands are comfortable with. And they claim, people like Ben Shapiro and his allies on the Hill, that this is a way to demonetize and take advertising revenue from supposedly conservative content.
Renée DiResta: That’s extended even to… If I’m not mistaken, Jim Jordan’s weaponization committee sent a letter to that advertising group alleging that they were also trying to somehow kill Twitter by withholding ads.
Berin Szoka: Indeed, Twitter has sued some organizations, most notably the Center for Countering Digital Hate in the United Kingdom. They brought a lawsuit against that organization that was tossed out as laughable. So we see these fights, we’ve seen some people have been successful in standing up to that kind of attack. Others… The Stanford Internet Observatory shut down, essentially. You described earlier that they’ve decided to refocus on child protection. I interpret what they’re doing as refocusing on things that don’t make Republicans mad, so that Jim Jordan will stop attacking them and they won’t have to keep spending money on this and risk alienating donors.
Renée DiResta: It’s funny… It didn’t go so well from that strategic front, because when the stories about the refocusing or whatever broke, first Jim Jordan declared victory, with tweets about how he had exercised robust oversight over the Stanford Internet Observatory — which is not Jim Jordan’s remit, just to be clear. And second, he promptly sent another letter reiterating that SIO was still under subpoena and he was going to continue to do whatever inquiries he wanted. And I think he wanted an explanation of whether we were really shutting down and all these other things.
Again, I’m not there anymore to know how they’re going to handle all of that. But it was the most counterproductive thing, as far as thinking that you’re going to win by retreating. This is where some of my frustration in the book with institutions continues. Because that’s just not an approach. That communicates that these kinds of attacks work, that this playbook will work on anybody. And from higher education in particular, and institutions that are supposed to be committed to academic freedom and freedom of inquiry, I think that kind of capitulation is actually terrible. I think it sends a terrible signal.
Geoff Kabaservice: Let’s rewind a little bit here. In 2020, you were with the Stanford Internet Observatory and you, together with the University of Washington’s Center for an Informed Public and the Atlantic Council’s Digital Forensic Research Lab, formed the Election Integrity Partnership. And the aim, as I understand it, was that you would focus on understanding viral narratives that might impact people’s right to vote or shape public opinion about the legitimacy of the election itself. And this work eventually would bring you into the crosshairs of Ohio’s Republican Representative Jim Jordan, who since 2023 has been chair of the Select Subcommittee on the Weaponization of the Federal Government.
Berin Szoka: For the weaponization of the federal government, actually.
Geoff Kabaservice: Can you tell me how this all transpired?
Renée DiResta: So the summer prior, in the summer of 2022, we were actually looking at election 2022 at the time, and again, watching viral narratives. A little bit of a different project… I was very much interested in things like Birdwatch and how community notes worked. Anyway, different types of work, but studying election narratives. As we were doing that, this think tank, so to speak, popped up out of nowhere. You can make a blog and call it a think tank; this is how the internet works. And this individual, Mike Benz, said at the time he had headed cyber at the State Department. This turned out not to be true. He apparently had worked at the State Department for two months. But using his affiliation, that sort of credential, he began to write all of these blog posts alleging that he was kind of a whistleblower, positioning himself as a whistleblower, revealing the presence of a vast cabal. And this vast cabal was our Election Integrity Partnership, these four academic institutions.
And using our own report, where we had publicly put out nearly two years prior material documenting the work that we had done, he picked random sentences and random facts out of the report, random numbers, decontextualized them or recontextualized them, and reframed the entire project as some sort of secretive cabal. Again, just to emphasize, he did this using our own public report that had sat on the internet for two years.
So that was the summer prior. We kind of ignored it, honestly. Then we started getting these inquiries from the press, from some of these media outlets that were willing to launder bullshit — using the academic definition of the term, information published without regard for the truth — just sort of laundering that stuff into the right-wing media.
This of course got a lot of pickup in right-wing media. Because as Berin notes, this idea that there was a vast censorship cabal censoring conservatives reinforced the priors that they’d been hearing for many, many years, including all throughout 2020, that there was a vast plot to silence them. And now we were being held up as “Here are the people here, these are the plotters.”
It’s hard to know in those moments how to respond. You don’t want to give credence to right-wing media. But it turns out what happens is when you respond to questions — they send a litany of questions, and we responded to the questions. And I remember in one case, they asked us something about our relationship, which government agencies we had been speaking with. And we didn’t, in our response, note some engagement we had with the State Department. We had mentioned all the others, but we didn’t mention State. It was not for any deliberate reason; it was that there was this Gish gallop, this litany of questions. And as we responded to the questions, we simply had not responded to that one. But they seized on the absence of a response and decided that that was where… We weren’t responding to that one because that was where we were guilty. And that perpetuated a whole other media cycle. You’re kind of damned if you do, damned if you don’t on the response front.
But what winds up happening, concurrently with all of this nonsense happening in the right-wing media ecosystem, is that this man is advocating that a committee with subpoena power investigate us. The House flips in November. Elon Musk has bought Twitter. The “Twitter Files” fake exposés are underway. And what you see happen there is this man, Mike Benz, the State Department non-head of cyber, connects with Matt Taibbi in a Twitter Spaces and says, “Oh, I’m going to tell you all about Renée DiResta and her CIA ties.”
Geoff Kabaservice: Perhaps explain for those who don’t know who Matt Taibbi is?
Renée DiResta: Matt Taibbi is a Substack writer, and he was a journalist at Rolling Stone and a few other places. He’s perhaps best known for a metaphor about Goldman Sachs being a “vampire squid.” Anyway, he’s at Substack now, and he writes this newsletter. He was awarded access to Twitter’s internal emails. The Twitter Files was this exercise in which Elon Musk gave journalists or writers access to Twitter’s internal emails. And so Benz connects with Taibbi, and what happens next is that Jim Jordan holds a hearing to investigate censorship conspiracies against conservatives, and specifically those involving us and our purported cabal.
Matt Taibbi and this other Twitter Files writer Michael Shellenberger testify, and Mike Benz sits behind them in the hearing. And then they just sort of regurgitate all of the material from this foundation’s blog. All of these weird takes and mischaracterizations of our work are now all of a sudden under oath, entered into the Congressional Record. And for me, I was like, “Wow, it’s really that easy, isn’t it?”
Geoff Kabaservice: The way you put it: “Matt Taibbi said something in a Twitter thread, and Jim Jordan got to read my emails.”
Renée DiResta: That’s literally it, that’s it. That’s what happened because all of a sudden, the day after the hearing, the letter arrives from Jim Jordan demanding all of our emails, my emails. And I was like, “And now we have to turn this over?” And the answer is yes, it turns out. Because we didn’t turn them over fast enough — I think two weeks go by — we get a subpoena. So now we’re under subpoena. And I was like, “This is the most incredible thing. This is all bullshit. There is no evidence here.” They were saying things like, “They censored 22 million tweets.” And I was like, “Is anybody going to surface any of those tweets? Where are the 22 million tweets? Where is the evidence of this? This is the most incredible thing I’ve ever heard.”
It turns out that in our report, which sat on the internet for two years after the election, we had counted up the most viral (meaning most successful) election rumors of the 2020 election: things like Dominion, things like Sharpiegate, the idea that Trump supporters were disenfranchised through Sharpie markers that machines wouldn’t read. All of these crazy theories: Hammer and Scorecard, that CIA supercomputers had changed votes, all of the most viral things. I think anybody listening to this who was marginally aware of the 2020 election has heard these stories. This blogger alleges that we got all of those stories nuked from the internet. And when we added up the number of tweets about them, we arrived at a number of 22 million. And that’s the number in our report. So he says that we censored 22 million tweets.
Again, it takes so much effort to explain what is at this point deep lore for this group of people who have been following this serialized story for over a year. Normal people have no idea that this is happening, and you sound like a crazy person when you try to explain it. But unfortunately, the subpoenas begin and then the lawsuits begin. And then all of a sudden not only are all of your emails and documents being turned over under subpoena, but then Stephen Miller sues us, America First Legal sues us, and so now we have civil litigation. And then materials that we turn over to Jim Jordan under subpoena somehow make their way over to America First Legal. It’s just the most incredible thing. So we’re being accused of cabaling and collusion and intertwinement, while this machine is operating over here on the flimsiest of pretexts. And yet allegations on the internet, it turns out, are really all it takes at this point.
Geoff Kabaservice: You wrote that “Institutions are ill-equipped to respond to bad-faith attacks.” I think it’s also fair to say that the whole system of our politics is ill-equipped to respond to bad-faith members of government such as Jim Jordan.
Renée DiResta: It was a really interesting experience. You go into rooms sometimes and you’re like, “Oh, the adults are going to come in at some point, and then everything’s going to go back to normal.” And then you realize that no, this is it.
Berin Szoka: Yeah, I had a moment like that myself in 2018. I was invited to testify about Section 230, which in the mythology of this subject area is the reason why there’s censorship on the internet. It’s not because of the First Amendment giving a right to social media sites to make editorial judgments, as the Supreme Court just said in the NetChoice case. No, no, no. It’s because Section 230 allows them to quickly get lawsuits dismissed when they’re sued for exercising that right.
I was at a hearing about this in 2018. I was there to testify alongside the noted constitutional scholars Diamond and Silk. And I kept thinking, as one Republican after another repeated total nonsense about how there was some kind of right to an audience, and how it would be conservative to have the government come in and intervene to protect conservative speech — I kept thinking that eventually somebody would say something reasonable. And somebody finally did. It was Ted Lieu. There was not a single adult on the Republican side of the aisle. Every single person there had bought into the idea that we should have a Fairness Doctrine for the internet — which is of course exactly what Republicans had opposed; opposing it was the core of their media policy for decades. But now the architecture has changed, and they think that people like you are the new equivalent of the Big Three media that used to control speech.
So my question for you is: Given how deeply polarized this has become, and given how asymmetrical it is between offense and defense in the dynamic that you described, do you see a way out of the increasing political polarization around this topic?
Renée DiResta: I really don’t know. I would like to think that… My experience with this has been that getting the facts out does not stop the attacks. It does not dispel the rumors. It does not return us to reality. People aren’t like, “Oh, okay, well yeah, there we go. That makes sense. I too can go and read that report and see that the 22 million is just the sum at the bottom of a table, and there’s no there there.” But that doesn’t seem to happen.
And that’s because right now these are power struggles. And as long as struggling for power in this way continues to serve them, they are not incentivized to stop doing it. And that is the thing that I keep coming back to. This is not a technological problem. You know that. This is a human problem. And this is an infrastructure for communication and for attention. And when you have that, people who are powerful, who want more attention and more ability to capture an audience, reach an audience, are going to continue to work the refs. This is basically a gigantic ref-working campaign. And I think “Elect better people” is not very satisfying.
Berin Szoka: You make some recommendations in your book about architectural changes for the services.
Renée DiResta: Services can do something, I think. I thought you meant the politicians.
Berin Szoka: Well, I mean both. Because they interact, right? The politicians behave the way they do because they get attention because of the way that the services work. Do you think that the services changing… You spent a lot of time in your book endorsing federated media and describing how that might improve the situation. Will it?
Renée DiResta: I was editing a paper on that last night until 2:00 A.M. It’s different.
Berin Szoka: What does that mean?
Renée DiResta: Let me explain. Right now there’s the unaccountable private power of very, very large platforms that do have the unique ability to determine what people see. There is an interesting dynamic there in that because they have that curation capacity, they decide what is going to go into your feed in the ways that we talked about earlier. Users don’t have very granular control over that. There are different ways to have social media. Mike Masnick wrote this foundational essay back in 2019, “Protocols, Not Platforms,” where what he basically argues is that you can have much more decentralized, federated social media — Mastodon is an example of this — where different people can run different servers and can set different rules on different servers. So you can go and you can find yourself a smaller place that’s more in line with your ideological beliefs or moderation preferences or what have you.
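To make the federated model concrete, here is a minimal sketch in TypeScript. It is an illustration only, not how Mastodon actually works: the Server class, its policy function, and the domain names are all invented for this example, and real federation runs over the ActivityPub protocol with far more machinery.

```typescript
// A minimal sketch of the federated model described above: each server in
// the network sets its own rules and decides locally whether to accept a
// post. All names here are hypothetical.

interface FederatedPost {
  author: string;
  homeServer: string;
  text: string;
}

class Server {
  constructor(
    public domain: string,
    // Each server's moderation policy is its own local decision.
    private policy: (post: FederatedPost) => boolean,
    // Servers can also refuse to federate with entire domains.
    private blockedDomains: Set<string> = new Set()
  ) {}

  accepts(post: FederatedPost): boolean {
    if (this.blockedDomains.has(post.homeServer)) return false;
    return this.policy(post);
  }
}

// Two servers, two rule sets: one strict, one permissive.
const strict = new Server(
  "strict.example",
  (p) => !p.text.includes("slur"),
  new Set(["spam.example"])
);
const permissive = new Server("anything-goes.example", () => true);

const post: FederatedPost = {
  author: "carol",
  homeServer: "spam.example",
  text: "hello fediverse",
};

// The same post can be visible on one server and invisible on another;
// moderation is decentralized rather than set by a single platform.
console.log(strict.accepts(post));     // false: the home domain is blocked
console.log(permissive.accepts(post)); // true
```

The point is simply that the moderation decision lives on each server rather than at a single central chokepoint, which is what lets users pick a server whose rules match their preferences.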
There’s another approach, called middleware, which argues that you can create an entire marketplace of third-party providers who build curation or moderation feeds that you can subscribe to. So if I wanted my Twitter feed to prioritize tweets from women, for example, I could subscribe to a middleware provider who builds that capability. The idea is that you can devolve control down to users through technological means. One of the arguments we’re trying to make in this paper is to look at how middleware and the devolution of control down to users would work in a centralized versus a decentralized environment.
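The middleware approach can be sketched in the same spirit. Everything here is hypothetical: the Post shape, the CurationProvider type, and makePriorityProvider are invented for illustration, and no platform exposes exactly this API. A provider-maintained list of boosted accounts stands in for the “prioritize tweets from women” example.

```typescript
// A minimal sketch of middleware: the platform hands over a raw, unranked
// feed, and a third-party provider the user subscribes to decides how it
// is curated. All names are hypothetical.

interface Post {
  id: string;
  author: string;
  text: string;
  postedAt: Date;
}

// A middleware provider is just a function from a raw feed to a curated one.
type CurationProvider = (feed: Post[]) => Post[];

// Example provider: boost posts from a provider-maintained account list
// (standing in for "prioritize tweets from women"), then fall back to
// reverse-chronological order for everything else.
function makePriorityProvider(prioritized: Set<string>): CurationProvider {
  return (feed) =>
    [...feed].sort((a, b) => {
      const aBoost = prioritized.has(a.author) ? 1 : 0;
      const bBoost = prioritized.has(b.author) ? 1 : 0;
      if (aBoost !== bBoost) return bBoost - aBoost; // boosted posts first
      return b.postedAt.getTime() - a.postedAt.getTime(); // newest first
    });
}

// The user "subscribes" by applying a provider of their choice to the
// platform's raw feed; control over curation has moved to the edge.
const rawFeed: Post[] = [
  { id: "1", author: "alice", text: "hello", postedAt: new Date("2024-06-01") },
  { id: "2", author: "bob", text: "world", postedAt: new Date("2024-06-02") },
];

const myProvider = makePriorityProvider(new Set(["alice"]));
console.log(myProvider(rawFeed).map((p) => p.author)); // ["alice", "bob"]
```

The design point is that the platform supplies the raw feed while the ranking logic is chosen, and can be swapped, by the user.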
But what you’re getting at is: Is it better? I mean, there are some really significant trade-offs. One of the challenges with the Fediverse, for example, is that centralized moderation does some things very well. Take CSAM, Child Sexual Abuse Material, which is of course highly illegal: it is the large platforms that have access to certain types of technologies that enable them to moderate it much more effectively. On a smaller server, say a random server on Mastodon, you, the operator, would be responsible for that, and that puts a lot of onus on a small server operator who is mostly a hobbyist with something else to do. There are no professionalized moderators.
And CSAM is obviously the clearest example because of its illegality. But with other aspects of content moderation, too, there are some good things about centralized moderation. Doxxing, for example, is dealt with very quickly on the large platforms; it’s not actually illegal, and on smaller servers it’s very hard to deal with. So I think what we’re seeing is people who are dissatisfied with the status quo looking for alternatives; alternatives are beginning to emerge as technology enables them; and the alternatives themselves will have trade-offs. And so some of what I’ve been writing about lately, with a think tank that focuses quite a bit on markets, is whether there is a market-driven approach to enabling this newer mechanism for engaging on the internet.
Geoff Kabaservice: Renée, I have a sort of final question. It’s well known that technological innovations, particularly in communications, can have very significant economic, political, and social impacts. Gutenberg invents the printing press; this leads to the Reformation, which leads to the Thirty Years’ War, which wipes out a quarter of the population of Germany. One hopes it won’t come to that with social media and the internet. But you do draw upon a historical example, toward the end of your book, which suggests some ways in which we might address the situation: the example of Charles Coughlin. Father Coughlin became one of the best-known communicators via the relatively new medium of radio in America in the 1930s. He began as a supporter of FDR and the New Deal, but then turned toward fascism and represented that kind of internal threat, frankly, to the United States during the era when Nazi Germany and Mussolini’s Italy were on the rise. And yet you say that the United States did end up addressing the Father Coughlin threat in some interesting and fairly long-lasting ways.
Renée DiResta: It did. That was actually where the Fairness Doctrine came from. Father Coughlin’s a really interesting case study because, as you know, there’s a new technological shift, and new influential figures who are very adept at using the medium become wildly popular — I think he had 30 million listeners at his peak…
Geoff Kabaservice: Out of a population of 120 million in the United States as a whole.
Renée DiResta: Right. He was just an incredibly influential figure. He’s trusted; he’s a priest. Even people who were not Catholics were listeners, in large part because he was a patriot; he had this very patriotic tone. Also, this was during and after the Great Depression, so the populism of the moment was really a populism that aimed to help the downtrodden, the people who had just had this life-altering, terrible experience, and so they saw him as a source of hope. Then he becomes a fascist, as you note; literally, he becomes a pro-Nazi sympathizer. And he begins to make comments… He tries to… Kristallnacht, I don’t remember the exact date…
Geoff Kabaservice: November 1938.
Renée DiResta: Thank you. What he winds up doing is blaming it on the Jews, so he kind of flips the victim and the offender there. And the broadcasters are horrified. This is seen as really crossing a line. And then they have to decide: Are they going to yank his license? So they start… It’s very similar to content moderation. It starts small: you lose access to your next broadcast; your broadcasts have to be pre-approved; you have to send us your speeches. There’s a fact-checker who comes on after his speeches and says, “Father Coughlin lied to you about this, this, and this.” FDR, meanwhile, as president, doesn’t want to be seen as intervening — because of the First Amendment, obviously, and also because Coughlin is a prominent critic of his, and he doesn’t want to be seen as using his power to suppress the speech of a political rival.
So the broadcasters… This is where you start to see them actually rewriting the rules for what kind of content is appropriate in a broadcast, and whether certain controversial viewpoints should have an alternative perspective presented immediately after them. So you actually see the broadcasters trying to prevent this from happening again. What ultimately reduces Father Coughlin’s power, though, interestingly, is that the Catholic Church begins to act. You see his supervisors within the Church saying, “Okay, enough. You’re done. You’re going to focus on your religious duties again.” And he does comply with that. But in the meantime, a small paramilitary organization has risen around him. The U.S. does eventually move, of course, into World War II, so there are some historical shifts that happen. But it is, in a sense, in-group moderation, if you will: the people who have some power over him choosing to exercise it in the way that they do.
In the meantime, though, in order to educate the public about what’s happening, a group forms called the Institute for Propaganda Analysis, which aims to educate the public — not in an effort to fact-check Father Coughlin’s speeches; the broadcasters have that on lock. But what they’re doing instead is they’re just saying, “Here’s how the rhetoric of propaganda works. These are the tactics that propagandists use that you should recognize, so when you hear a speech by Father Coughlin, the reason it works on you is” — and then they come up with these little breakdowns, they give them phrases like “the glittering generality.” And they kind of articulate it in pamphlets. They’re literally passing out these pamphlets. It’s a media literacy effort.
Geoff Kabaservice: And also a civic education project.
Renée DiResta: Exactly. And they’re actually… I went and I pulled the archives on these, and they actually used emoji. I don’t know why. I was like, “This is absolutely incredible.” First of all, I didn’t know this was even possible, that you could print little tiny images in line with text in the 1930s — you can, apparently. And when you look at these pamphlets, it was like, why aren’t we doing this now? It seems like rhetoric and trope education would be highly useful today — because again, the speech patterns of demagogues don’t change. So why aren’t we doing a better job explaining to people what they look like?
Berin Szoka: So to distill the Father Coughlin story, it sounds like civil society and market forces work together to counter a demagogue. And today the war on content moderation is intended to stop exactly that from happening: to stop civil society, to stop market forces from pushing back against this kind of noxious content, to keep the new Father Coughlins online.
Renée DiResta: Right. That’s correct.
Geoff Kabaservice: Well, thank you so much, Renée DiResta, for joining us today. Thank you, Berin Szoka, for your incisive questions. And congratulations, Renée, on the publication of Invisible Rulers: The People Who Turn Lies Into Reality. It’s been great to have you with us today.
Renée DiResta: Thank you so much.
Geoff Kabaservice: And thank you all for listening to the Vital Center podcast. Please subscribe and rate us on your preferred podcasting platform. And if you have any questions, comments, or other responses, please include them along with your rating or send us an email at contact@niskanencenter.org. Thanks as always to our technical director, Kristie Eshelman, our sound engineer, Ray Ingenieri, and the Niskanen Center in Washington, D.C.