With his expertise in physics, engineering, computer science, and sociology, AAAS Fellow Duncan Watts, Ph.D., says he’s attracted to research that doesn't fit neatly into one scientific bucket. His heterogeneous experience, spanning academia, industry (Yahoo and Microsoft), and the Royal Australian Navy, has helped prepare him for his current work unraveling messy, polarizing issues like disinformation, bias, and mistrust.
“It's all been useful for what I'm doing now. I've always been drawn to problems that sort of sit in the cracks in between fields. I think it has given me a different perspective. I've just learned about how people see things in very different ways,” Watts says. His Computational Social Science Lab at the University of Pennsylvania is a joint venture of the School of Engineering and Applied Science, the Annenberg School for Communication, and the Wharton School. It is built on tools like mass collaboration and on open, transparent, and replicable science.
“Many of the societal problems that we care about are both technological and social in nature: you can't solve any of these problems looking from just one lens. So, in that sense, I feel somewhat justified in having taken the path that I've taken,” says Watts.
Watts studies the misinformation that seems to bombard our world through television, the internet, and social media. It can be subtle and destructive, but he says it seldom fits the now ubiquitous label of “fake news.” More often it is a biased presentation of information to fit the agenda of the company or individual presenting it.
“Fake news” in the research literature is defined as deliberately engineered, 100% false content that is made to look like real news. Watts says it is rare. Instead, what millions see every day on TV and the internet is at least partly true but is often biased, presenting a particular point of view while ignoring others.
News editors and broadcast producers choose when they want to talk about an issue, which facts to highlight and what to ignore. They also choose whose opinions to promote and whose to challenge. So, very different stories can be told about the same underlying “facts.” The result is not necessarily false, Watts says, but can still be misleading.
“I don't know why anybody bothers lying. If you lie, you can get caught, and then you look bad. But if you just use biased information, you can say, ‘I didn’t say anything false, and I’m entitled to my opinion about what is important. There's nothing wrong here, nothing to see.’ Many people, including many journalists, would agree with this response, but from the perspective of public opinion and understanding it’s still a problem. When millions of people are getting biased information from organizations they trust, that can have a much larger effect on misunderstanding than when much smaller groups of people fall for flat-out false information,” he says.
Wildly different versions of facts taught Watts and other social scientists many hard lessons during the COVID-19 pandemic. The refusal by so many in the U.S. to get vaccinated was especially hard to grasp.
“Psychological and social factors were incredibly important, and everybody underestimated their importance. I think it’s fair to say that policy makers in general never anticipated that a sizeable fraction of the population wouldn't want to get vaccinated even after vaccines had become widely and freely available. It's just an astounding thing,” Watts notes.
Because social scientists now recognize that pandemic preparedness is every bit as much social and psychological as it is biological, they are making preparation for the next pandemic a research priority. Watts’ team, for instance, is working on a project that could improve public health experts' ability to monitor future disease outbreaks. They are modifying an existing business application that taps into huge amounts of data collected from GPS pings on cell phones.
While there are privacy concerns, Watts says there’s also promise from this computational social science tool for tracking a disease outbreak. It could document when people are at home versus when they are traveling, keep track of how many people are gathering in particular locations and map other crucial data that might be relevant to epidemiologists.
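The core of such a tool is aggregation: turning raw location pings into venue-level counts that epidemiologists can use. The sketch below is purely illustrative, with a hypothetical record format (device ID, venue ID, timestamp) that is not drawn from Watts' actual application.

```python
from collections import Counter
from datetime import datetime

# Hypothetical ping records: (anonymized_device_id, venue_id, timestamp).
# Real mobility datasets differ; this only sketches the aggregation idea.
pings = [
    ("dev1", "grocery_42", datetime(2024, 1, 5, 17, 0)),
    ("dev2", "grocery_42", datetime(2024, 1, 5, 17, 5)),
    ("dev3", "grocery_42", datetime(2024, 1, 5, 17, 9)),
    ("dev1", "home",       datetime(2024, 1, 5, 19, 0)),
]

def gatherings_per_venue(pings, hour):
    """Count distinct devices seen at each venue during a given hour --
    the kind of aggregate statistic an outbreak model might consume."""
    seen = Counter()
    counted = set()
    for device, venue, ts in pings:
        if ts.hour == hour and (device, venue) not in counted:
            counted.add((device, venue))
            seen[venue] += 1
    return seen

print(gatherings_per_venue(pings, 17))  # prints Counter({'grocery_42': 3})
```

Publishing only these per-venue counts, rather than individual trajectories, is one common way such systems try to balance epidemiological value against the privacy concerns Watts notes.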
“I think it has tremendous potential to really change the way we model how disease is spread, and how we develop interventions to slow down disease spread by closing down certain venues or industries, or making recommendations about stay-at-home policies,” explains Watts.
When it comes to communicating information about the coronavirus, Watts also believes the scientific community needs to make some changes to maintain public trust. That means taking a hard look at the very fundamentals of the research and peer-review process that has been the status quo for generations. A movement that started in psychology and has spread to other social sciences has been examining whether previously published findings are replicable.
On most research projects, he explained, scientists have enormous analytical flexibility in going from a general hypothesis to a very specific statistical test. That means a vast number of choices: how subjects are tested, what kind of experiment is run, what kind of lab is used. Other decisions include when to stop gathering data and what to do if you don't get the answer you expect.
“And because throughout this process there is a bias towards finding something interesting and publishable, you get false positives. Ultimately the consequence is that we may not actually know much of what we think we know. Maybe we do and maybe don’t; the problem is that we have no way of knowing just from the literature,” he says.
That means revisiting many findings once thought settled and confirming that prior conclusions still hold.
“At the end of the day, at least we should be able to say with more confidence that people should trust us,” says Watts.