5 Lessons for Pandemic Misinformation Research
May 2020
How can researchers contribute to action and public understanding during fast-moving events like elections, pandemics, and other crises?
On Friday at the Cornell Department of Communication, we heard from Dr. Claire Wardle, a leading expert on social media, user-generated content, and verification. Her research sits at the increasingly visible and critical intersection of technology, communications theory, and mass and social media. Claire is the co-founder and leader of First Draft, a nonprofit focused on research and practice to address mis- and disinformation.
Claire tells us that she co-founded First Draft in 2015 to help journalists learn how to verify content online. If a chemical weapons attack happens in Syria and all you have to go on is YouTube footage, how do you verify it? Since then, the organization has grown to a team of 40 that supports newsrooms, policymakers, and tech platforms internationally.
What is Misinformation?
What is misinformation? Back in 2016, when President Trump called CNN “Fake News,” it became the phrase of the year. But most misinformation is genuine content used out of context, and most of it isn’t news. Not only does the term fail to describe the problem, it was also weaponized by political leaders. So Claire created a typology of the different kinds of “fake news.” Because rumors, lies, propaganda, fabricated content, and false context are all different, we need clear thinking to be able to imagine interventions.
Before the pandemic, people were often dismissive of the very real harms of mis- and disinformation. During the pandemic, those harms have become much clearer.
Producing Knowledge at the Pace of Need
Claire tells us that one of her frustrations with misinformation research is that it’s mostly done in labs with American undergraduates. First Draft works with journalists to understand what people are experiencing in their lives, test ideas for intervening, and support journalists and tech platforms to understand what to do.
When we think about the pandemic, Claire says, we should be thinking in terms of crisis communications rather than addressing falsehoods with facts. Unfortunately, many conversations about misinformation were focused on cases of certainty rather than situations of uncertainty.
Journalists and public health communicators were unprepared for the pandemic, Claire tells us. Bad actors can combine memes, emotional appeals, diagrams, and narratives into media with high production values that people will share widely. Journalists and researchers aren’t as well resourced or experienced at creating shareable content, and we need to close that gap.

How are Journalists Covering Misinformation?
Journalists haven’t traditionally covered false information; previously it simply wouldn’t get covered, says Claire. But now, as Whitney Phillips has argued, media manipulators *want* journalists to cover their material and attempt to debunk it.
Claire warns that fact-checking attempts can broaden the audience for disinformation. That’s what happened with the recent anti-vax “Plandemic” documentary. It had been circulating for a while, but once its supporters got it to trend on Twitter, journalists felt they had to cover the documentary, giving it even more attention.
How are Platforms Navigating the Pandemic?
When they were founded, tech companies wanted to achieve scale as quickly as possible, with little understanding of the challenges of operating across cultures. In the last decade, they have learned how difficult it is to play arbiter of truth: on most kinds of speech, there are usually two sides. During the pandemic, there were six weeks during which companies had a free hand, because no one was arguing for more pandemic misinformation. As governments and citizens debate pandemic-management policies, that consensus is disappearing. And most discourse about the pandemic is completely legal. So the platforms’ job is about to become much harder.
Claire talks about the lack of independent oversight of platforms; they will often refuse to provide even the most basic information about their power in the world. Facebook has just announced its Oversight Board; Claire gives them three months to demonstrate how well the system will work.
How Do You Persist in Doing This Intense Work?
You have to start by remembering the timescale of change, Claire tells us. When people are in a crisis, they often want a quick fix to complex problems. We have to accept that we live in an information-polluted environment. There will never be an internet where all problematic speech disappears, and that wouldn’t be a free society anyway. So even as we respond quickly, we need to be prepared for the long game.