Christie Aschwanden is a freelance writer and author of Good to Go: What the Athlete in All of Us Can Learn From the Strange Science of Recovery. Aschwanden won a 2016 AAAS Kavli Science Journalism Silver Award for three stories focused on data manipulation, and her pandemic reporting has continued to look critically at statistical analysis of COVID-19 data. Aschwanden’s most recent work has addressed everything from COVID-19 testing accuracy to false death count inflation claims, and included working with the National Association for Science Writers to create a free discussion board for journalists covering the pandemic. Emily Hughes, communications associate for the AAAS Kavli awards programs, spoke recently with Aschwanden about data reporting during the pandemic.
Q. As COVID-19 cases spike across the country (and as we start to see more vaccine data coming out), do you see any differences in how journalists are approaching the data now compared with that initial spike back in March?
A. First, journalists now have a much better grounding about the numbers than they did back in March. I think there's a much greater understanding across the board about what these numbers mean, what sorts of questions we need to be asking. At this point it's easy to forget that this is a virus that we didn't even know about eleven months ago. Everything's happening really fast.
And we also have so many journalists now who are covering this, who weren't covering infectious diseases before. Or at least not with this granularity. So, there are a lot of things that everyone needed to learn about: R-naught, positivity rates, infection fatality rates, case fatality rates. All of these are fairly complicated concepts to learn.
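As a quick sketch of how a few of those rates differ, here is a worked example with purely illustrative numbers (not real COVID-19 data):

```python
# Illustrative numbers only -- not real COVID-19 data.
tests_performed = 10_000
positive_tests = 800
confirmed_cases = 800
estimated_infections = 4_000   # confirmed cases plus estimated undetected infections
deaths = 8

# Test positivity rate: share of tests that come back positive.
positivity_rate = positive_tests / tests_performed   # 8%

# Case fatality rate: deaths among *confirmed* cases only.
cfr = deaths / confirmed_cases                       # 1%

# Infection fatality rate: deaths among *all* infections,
# including those never confirmed by a test -- so it is lower than the CFR.
ifr = deaths / estimated_infections                  # 0.2%

print(f"positivity {positivity_rate:.1%}, CFR {cfr:.1%}, IFR {ifr:.1%}")
```

The gap between the CFR and the IFR is one reason early fatality estimates looked so different across outlets: they depend on how many undetected infections you assume.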
Q. Do you feel like you've seen journalists become less hesitant to approach data reporting now?
A. Yes, I think that they're seeking out data in a way that maybe early on they didn't know to do. For instance, positivity rates, not just infection rates: early on, I think not a lot of people really knew what that meant or what they needed to be asking. Whereas now that's pretty widely understood. People know that it's important. A literacy around some of these terms has developed over the last six months.
Q. In a story for Scientific American you debunk the myth about COVID-19 death counts being falsely inflated. What led to that myth's popularization?
A. I think it’s very clear. We have a president who is on the record admitting that he has downplayed the risk of the virus. I mean he's said that on tape. And he's also spread false claims about COVID deaths. And then you also have partisan media which has amplified these kinds of myths. And, of course, it goes without saying I guess at this point that we've got the internet which is just awash with conspiracy theories. When my story was published I heard from a lot of people who just seemed immune to facts and distrustful of any kind of authority.
Q. What sort of strategies do you think journalists could take to counteract that?
A. I don't think that the problem is reporting, per se. It's really the rise of bad actors and misinformation and propaganda on social media. I think the past few years have just driven home how hard it is to counteract those kinds of misinformation campaigns.
What can we do? I think we need to really be very mindful that people think in stories. They don't usually think in numbers. The strategy is to not keep saying the lie that deaths are inflated, but to say here’s why we know there are this many deaths. Sort of stating facts rather than reiterating lies.
Q. You mentioned the importance of building narratives, and the way people think in stories. How do you find a balance between that focus on telling a good story and explaining the underlying math in your reporting?
A. The public doesn't need to understand the underlying math. I think they do need to know, however, why R-naught is important and why it's something that's being tracked. But they don't necessarily need to walk away from a story understanding how to calculate it.
While we do have a natural bias towards stories, we also have this bias to think that a number is more precise or more accurate than just an idea. So, I think data can be used in a powerful way if you can show people numbers that illustrate the point. The data is sort of a tool for storytelling and a tool for explaining the facts, but it has to go along with story too.
Q. Are outlets changing their approach to data journalism, and telling those stories with the numbers?
A. I do think that we're seeing more charts, more use of data to illustrate ideas which is good. I mean this whole “flatten the curve” idea, in a way it seems like a watershed moment. The fact that the public was able to embrace this complex idea about data ― that's pretty cool. I hope that we will see after the pandemic more use of graphing data and more illustrations using data now that we see the public can understand that.
Q. What basic math skills do you feel are essential for science journalists who are reporting right now?
A. I think of course numeracy, understanding numbers, having some sort of comfort level with them. Being able to look at numbers and see orders of magnitude and whatnot. Understanding averages, means, and percentages. Base rates have become really important as we're talking about testing. Absolute versus relative risk is pretty important too.
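To illustrate why base rates matter so much for testing stories, here is a sketch using hypothetical numbers (the prevalence, sensitivity, and specificity below are assumptions chosen for illustration, not figures from any real test):

```python
# Why the base rate matters for test accuracy -- hypothetical numbers.
prevalence = 0.01        # base rate: 1% of the tested population is actually infected
sensitivity = 0.95       # P(test positive | infected)
specificity = 0.99       # P(test negative | not infected)

population = 100_000
infected = population * prevalence          # 1,000 people
healthy = population - infected             # 99,000 people

true_positives = infected * sensitivity     # 950
false_positives = healthy * (1 - specificity)   # 990

# Positive predictive value: the chance a positive result reflects a real infection.
ppv = true_positives / (true_positives + false_positives)
print(f"PPV at 1% prevalence: {ppv:.1%}")   # roughly 49%
```

Even with a seemingly accurate test, at low prevalence a positive result is close to a coin flip, because the false positives from the large healthy group outnumber or match the true positives.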
I'm definitely a geek about P-values, and I do think that we should get away from relying on them too much [as a measure of statistical reliability]. It’s important for journalists to understand how to navigate them a little bit and what they can and can't tell us. In general, it’s important for journalists to strive to understand what any particular statistic or figure can do and can't do.
When it comes down to it, I think even more important than particular math skills is a basic understanding of study design, particularly now as we’re into vaccine trials and so many drug trials. For instance, what can we learn from various types of studies, what are the limitations of studies without a control arm? What kinds of practices make a study more or less reliable? It’s important for journalists to understand those questions and to be able to understand a study’s inherent strengths and weaknesses.
Q. What are the biggest takeaways that you've encountered so far in your pandemic reporting?
A. I think one of the big lessons here ― and this is something I totally knew and understood before the pandemic but it's really surfacing [now] ― is that we as human beings are bad at dealing with uncertainty. And that's sort of across the board. But with COVID in particular, this is a really fast-evolving situation.
A lot of this goes to the problem that so much of the public doesn't understand how science works and how it's done. And that every finding is sort of temporary until we learn more. Anything we know can be overturned by new facts, and this is science working as it should. This is not science being unreliable.
Masks are a classic example. Early on people were told don't go out and try to get masks, and part of this is because at that time we had shortages of supplies and the concern was that healthcare workers were not going to be able to get the supplies they needed. We have new information and we now know that masking can be an important thing that people can do and should do.
My state, Colorado, has a state-wide mask mandate right now. From what we know at the moment, that’s a good thing. But what we need the public to understand is that things do change. And that the uncertainty is just an inherent property of this thing that we’re in. It is not something nefarious.
The other thing about this uncertainty is that it’s really given a toehold for misinformation to spread, and that’s dangerous. The stuff about masking is a good example. Anti-maskers say, “well you told us early on not to do it and now they're telling us to do it, therefore they don't know anything and they're just manipulating us.” We need to be open to new evidence.
Q. Have you developed any strategies to help communicate that scientific process, and to deal with uncertainty in your own writing?
A. I feel like so many of the stories I write have this as a thread and an undercurrent. I mean even my book is so much about this. I think that one way of doing it is just to do your best in every case to sort of make clear [any] uncertainties in whatever idea or thing it is that you're talking about.
We don't know right now which vaccine is going to be best. There's a lot of work going on right now to make important decisions about these things, but we're going to make some mistakes, and our understanding of this stuff is going to evolve, so we need to be ready for that.
Q. Do you think that there has been enough transparency around, not just how COVID-19 data has been collected and analyzed, but also around the scientific process from scientists and government organizations?
A. The scientists themselves have been pretty good. We've had an incredible number of preprints being published and, in many cases, discussed in social media. There have been robust discussions and a lot of data sharing, and when people aren't sharing data there's pushback.
I think the big problem we've seen here is access to data. Particularly early on there were cases where the important data wasn't being collected, or wasn't being collected in a way for us to get the information from it that we needed. In fact it was so bad that a couple of journalists at The Atlantic, Robinson Meyer and Alexis Madrigal, actually made their own database. They basically put together this crowdsourced project of journalists, data scientists and volunteers to do this project which is now known as the COVID Tracking Project. They stepped in because the government and the people you would expect to be doing this weren't.
Q. Have you seen a lot of examples of journalists stepping up to collect data or present the science themselves?
A. There was a really great series that just won one of the AAAS Kavli awards. A former colleague of mine was a reporter on the project. It was a Kaiser Health News/AP project where they were looking at what was happening at public health departments. And they did a lot of gumshoe reporting there, you know, go out and get those numbers and collect the stuff. It was kind of hiding in plain sight, but they had to do the work of actually getting that information together because no one was doing that. ProPublica has been a leader in this, too ― collecting data, then making it available to the public and also to other reporters.