The wide access to information the Internet gives us allows the scientific community to reach millions of people quickly; at the same time, misinformation about science spreads just as rapidly. During the 2022 AAAS Annual Meeting, science communicators and other specialists discussed how we can build trust between scientists and the public and how we can reduce the power of misinformation and disinformation.
While we often blame uninformed Americans for spreading misinformation about science, panelists at a Friday session titled “Communication is Key: Science’s Role in Sowing Distrust of Science” discussed how scientists themselves have a responsibility to build trust in science.
“I think we’ve all seen public displays of distrust in science, especially during the pandemic,” Stefan Peterson, a PhD candidate at the University of Pennsylvania and the moderator of the session, noted. “While there are many factors that contribute to how people in different communities interacted with science, in this session we wanted to pay particular attention to what scientists are doing to affect the public’s engagement with science.”
Kayla Davis, an AAAS Science & Technology Policy Fellow and instructor at Harvard Medical School, highlighted a Broad Institute study that looked at the impact of genetics and other factors on same-sex behavior.
She noted that the Broad Institute took time to consult its internal LGBTQ group, inviting members to weigh in and help shape communication about the research. Yet despite this groundwork, the study’s results were still misused.
“One of the most concerning things that came from this: someone picked up on the information and they actually created an app titled ‘How Gay Are You’ that allows you to upload your genetic information that you’ve obtained through a company like 23andMe or Ancestry.com and get a measurement of how gay you might be. You can imagine all kinds of social issues, sociopolitical issues that might come from that!” she said.
Katherine Canfield, a postdoctoral researcher at the Environmental Protection Agency, stressed that researchers need to proactively think about how their research will be received and used by others. “Communication can’t be the afterthought,” she noted. “Thinking about the impacts of our work can’t be something we wait to do until after we’re publishing peer-reviewed studies.”
Another panel that same afternoon called “The Science of Combating Disinformation” looked at the science behind the power of misinformation and disinformation.
David Yanagizawa-Drott, a researcher at the University of Zurich, described a paper he co-authored examining the impact of two different Fox News opinion programs – one hosted by Tucker Carlson and the other hosted by Sean Hannity – on COVID-19 outcomes.
Yanagizawa-Drott noted that Carlson was more alarmed by COVID-19 than Hannity was during the first few weeks of the pandemic. “Individuals and counties that preferred to view Hannity over Tucker Carlson had radically different behavior initially… they reacted much later in terms of changing behavior… social distancing, washing their hands, etc.,” he said.
Beth Goldberg, who works as a researcher at Google, suggested that we think about combating misinformation as similar to inoculating against a virus. She worked with other researchers to create short informational videos designed to inoculate people against manipulation techniques by demonstrating how they work.
“What we were looking for were three outcomes. We wanted to know if our inoculation videos were able to confer greater discernment, or the ability to recognize when there was a manipulative technique, AKA likely misinformation. Did our inoculation video confer trustworthiness in the posts that were less manipulative? And lastly, did it affect sharing? Did it actually affect behavior online just through these 90-second videos?” she explained. “So what we found was our videos did actually affect all three of those outcomes: discernment, trustworthiness, and sharing.”
On Sunday, specialists provided additional potential solutions to the spread of misinformation during a panel titled “When Evidence is Not Enough: The Science of Misinformation.”
Lindsey Juarez, the director of Irrational Labs, offered some optimism about what we can do to push back against misinformation. She described an intervention she helped develop for the social media service TikTok that involves giving users a warning label when they encounter potentially misleading content.
“It’s a very simple message. It says, ‘Caution, this video has been flagged for unverified information,’” she noted. There was also a second intervention: when a user tried to share one of the flagged videos, they’d be asked if they really wanted to share it given the flag.
The messages appeared to have an impact. “[We] found that the initial banner reduces views by about five percent, reduces likes by about seven percent, and then huge effects on shares – we found that that reduced them by about twenty-four percent,” she noted. “And so really I feel like it’s a small intervention… but really does capitalize on refocusing people on accuracy. And so it feels like a promising intervention.”
[Associated image: Adobe Stock/wachiwit]