When you think about online disinformation, the first sites that come to mind might be 4chan or Reddit.
If you talk to Danny Rogers, co-founder and Chief Technical Officer of the Global Disinformation Index (GDI), a very different platform comes up.
It’s Etsy – that vaunted online marketplace for crafters, knitters and jewelry makers.
“Go to Etsy,” Rogers said. “Type ‘QAnon.’ You’ll see pages of QAnon merchandise. QAnon has led to people’s deaths. That certainly has been monetized through the Etsy.com platform.”
Rogers, an AAAS Member and quantum physicist with a research background in quantum cryptography, went to work early in his career for the Johns Hopkins University Applied Physics Laboratory. He managed technology projects for the Department of Defense and the intelligence community, including initiatives in radio frequency signal processing, navigation and geolocation techniques.
In 2017, he co-founded the nonprofit GDI to respond to the surge of weaponized disinformation that appeared before the 2016 presidential election. He noted the most high-profile case: the troll farm known as the Internet Research Agency, which employed hundreds of people to amplify hoaxes and conflicts on social media to undermine election information.
“I saw the writing on the wall of real harm being done,” said Rogers, who also teaches as an adjunct professor at New York University’s Center for Global Affairs. He sees the work of the Internet Research Agency and other similar players as “brazen attacks on our information environment.”
Today, he is focused on “de-platforming thousands of actors,” he said, “including potential acts of deadly violence.” For the sake of security, he can’t go into detail, but he said the GDI’s efforts are paying off.
Rogers is now focused on stopping disinformation, but his background in quantum physics plays a significant role in that work.
Studying statistical mechanics, Rogers said, lends understanding to “how micro-level individual behaviors or interactions can scale up into macro-level phenomena.”
For example, a physicist using statistical mechanics would observe the behavior of individual atomic particles to determine whether they would form a liquid or a solid when combined at a larger scale.
With data, Rogers said, this same approach can be used to predict “large-scale effects from small-scale activities,” such as traffic jams that arise from the behaviors of individual drivers, or how someone might vote when exposed to a certain information environment.
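As a rough illustration of that micro-to-macro idea (not GDI’s actual model), the toy simulation below applies a simple individual-level update rule to a thousand simulated people and then reads off a single population-level average. Every rule, parameter and number in it is an assumption chosen for clarity.

```python
import random


def simulate_opinions(n_agents=1000, n_steps=50, influence=0.1, seed=42):
    """Toy micro-to-macro simulation (illustrative only, not GDI's model).

    Each simulated person holds an opinion in [-1, 1] and, at every step,
    drifts slightly toward the average opinion of a few randomly sampled
    people they are "exposed" to.
    """
    rng = random.Random(seed)
    opinions = [rng.uniform(-1.0, 1.0) for _ in range(n_agents)]
    for _ in range(n_steps):
        for i in range(n_agents):
            # Micro-level rule: sample a handful of "neighbors" and nudge
            # this agent's opinion toward what it sees.
            neighbors = rng.sample(range(n_agents), k=5)
            local_mean = sum(opinions[j] for j in neighbors) / len(neighbors)
            opinions[i] += influence * (local_mean - opinions[i])
    # Macro-level observable: the population-wide average opinion.
    return sum(opinions) / n_agents


if __name__ == "__main__":
    print(f"Average opinion after simulation: {simulate_opinions():+.3f}")
```

The point of the sketch is only that a tiny individual-level rule, iterated many times, produces a measurable aggregate outcome – the kind of reasoning Rogers describes applying to information environments.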
Now, he is applying this type of analysis to de-platforming bad actors across advertising systems and social media.
GDI’s system analyzes content and context flags that help assess a domain’s disinformation risk. It rates websites based on variables that include overall credibility, whether they push sensationalism, whether they contain hate speech, and whether the site’s operator embraces sound content policies.
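As a hedged sketch of how such flags might be combined into a single rating – the flag names, weights and 0-100 scale below are assumptions for illustration, not GDI’s published methodology:

```python
from dataclasses import dataclass


@dataclass
class DomainFlags:
    """Hypothetical content/context signals for one domain.

    The flag names, scales, and weights here are assumptions for
    illustration, not GDI's published methodology.
    """
    credibility: float     # 0.0 (not credible) .. 1.0 (highly credible)
    sensationalism: float  # 0.0 (none) .. 1.0 (pervasive)
    hate_speech: float     # 0.0 (none) .. 1.0 (pervasive)
    sound_policies: bool   # does the operator publish sound content policies?


def disinformation_risk(flags: DomainFlags) -> float:
    """Combine the flags into a 0-100 risk score using assumed weights."""
    score = 35.0 * (1.0 - flags.credibility)                 # low credibility raises risk
    score += 25.0 * flags.sensationalism                     # sensationalism raises risk
    score += 30.0 * flags.hate_speech                        # hate speech raises risk
    score += 10.0 * (0.0 if flags.sound_policies else 1.0)   # missing policies raise risk
    return round(score, 1)


# Example: a low-credibility, sensationalist site with no content policies.
print(disinformation_risk(DomainFlags(credibility=0.2, sensationalism=0.9,
                                      hate_speech=0.4, sound_policies=False)))
# -> 72.5
```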
Another strategy for GDI? Ad tech.
“We’re doing everything we can in ad tech – providing block lists and recommending policy change,” Rogers said.
Web companies apply GDI’s technology to the hard-to-track, Wild West environment of online advertising. The nonprofit provides companies with information about the activities of malicious actors, using automated “classifiers” that rely on machine learning to identify information coming from “junk domains.” In a test, “the prototype classifier identified 98.8 percent of domains that had been pre-labeled as junk,” according to a research paper produced by the group.
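The paper quoted here does not specify the model, so the following is only a minimal sketch of the general technique it names: a machine-learning text classifier trained on pages from domains already labeled junk or quality. The library choice, features and placeholder examples are assumptions for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training examples: page text from domains a human reviewer
# has already labeled as "junk" (1) or "quality" (0).
texts = [
    "SHOCKING miracle cure THEY don't want you to know about",
    "Doctors HATE this one weird trick, share before it gets deleted",
    "City council approves budget after public hearing on Tuesday",
    "Researchers publish peer-reviewed study on vaccine efficacy",
]
labels = [1, 1, 0, 0]

# Word and bigram frequency features feeding a simple linear model.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
classifier.fit(texts, labels)

# Score an unseen page: estimated probability that its domain is "junk".
new_page = ["Secret cure suppressed by the government, click before it's gone"]
print(classifier.predict_proba(new_page)[0][1])
```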
To track the sources and validity of advertising, the GDI software checks the sites from which the ads originate and determines whether they are high-quality or high-risk.
Once companies have the data, they can decide whether to allow domains with high “junk” ratings to advertise on their sites. As a result, not only could sites cut off funding to nefarious actors, but they could also redirect valuable ad dollars to quality news sites.
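The block lists Rogers mentions are, in effect, the output of that decision. Here is a minimal sketch of the idea, assuming the hypothetical 0-100 risk scale from the earlier scoring example; the threshold and domain names are placeholders.

```python
# Hypothetical 0-100 risk ratings keyed by domain (placeholder values).
domain_ratings = {
    "example-news.com": 12.5,
    "example-clickbait.net": 88.0,
}

JUNK_THRESHOLD = 70.0  # assumed cutoff on the 0-100 scale


def allowed_to_advertise_on(domain: str) -> bool:
    """Allow ad spending on a domain only if its risk is below the threshold.

    Unknown domains are treated as blocked here; a real system would make
    an explicit policy choice for unrated domains.
    """
    return domain_ratings.get(domain, JUNK_THRESHOLD) < JUNK_THRESHOLD


# The block list is simply the set of domains at or over the threshold.
blocklist = [d for d, score in domain_ratings.items() if score >= JUNK_THRESHOLD]
print(blocklist)  # ['example-clickbait.net']
```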
The GDI is supported by the Foreign & Commonwealth Office of the United Kingdom, the Knight Foundation, the Charles Koch Institute, Luminate (a business accelerator), and Meedan (a nonprofit that builds verification software).
Today, in the ongoing crisis of a global pandemic, Rogers sees the work of GDI as only growing.
“I’ve been calling [COVID-19] the Super Bowl of disinformation,” Rogers said. “It is overtaking everything at this point.”
Rogers cited domestic actors hawking bunk cures, people weaponizing stories to incite racial violence, conspiracy theorists, and threats to public order and the geopolitical order. The goal, he said, is to conflate differences.
Ask him who the players are, and he’ll laugh.
“Everyone,” he said.
Rogers sees the harm very clearly – that people may live or die, or be sick or well, based upon the information they receive. He predicts that people’s information environments will correlate directly with their health outcomes, especially as social networks look to keep people online as long as possible.
“All the services are competing for your attention,” Rogers said. “Everyone is trying to show you a more personalized vision. Algorithms are trying to measure what is most engaging. They’re all there to try to keep you staring at that screen.”
As a result, he said, rather than having a shared information experience – the way previous generations did when they flipped on the evening news – everyone is living in their own rabbit hole.
“Suddenly,” Rogers said, “no one lives in the same reality anymore.”
To learn more about Rogers' thoughts on COVID-19 and misinformation, tune into his AAAS Community Chat on April 30.