Inside Filippo Menczer’s Toolbox for Disassembling Misinformation Campaigns

AAAS Member Filippo Menczer, Ph.D.

In another time, another place, AAAS Member Filippo Menczer, Ph.D., squints at his lab mate’s computer monitor in a San Diego lab. He’s looking at the Mona Lisa, da Vinci’s painting that lives in the Louvre Museum in Paris, more than 5,500 miles away.

Such was Menczer’s first encounter — one he says he’ll never forget — with the web.

Today, the computer scientist and distinguished professor at the Indiana University Luddy School of Informatics, Computing, and Engineering studies a problem born of the very network he marveled at more than two decades ago — deceit on social media. Menczer’s team builds tools to analyze and counter misinformation, disinformation, and manipulation on social networks. It’s work that has become increasingly important, says Menczer.

“From social media, you can get a pulse of society,” notes Menczer. “You can see what people are thinking, you can track their opinions… and you can also abuse that.”

At the Observatory on Social Media (OSoMe), Menczer oversees research on the role of media and technology in society and designs tools for unraveling the treacherous webs spun by bad actors. These tools include Botometer, a bot detection tool that sniffs out fake accounts called social bots, and Hoaxy, a search engine that visualizes the spread of information, true and false, on Twitter. Both were used to map out the role of social bots in spreading untrustworthy content during the 2016 presidential election and to take down voter-suppression bots.

Menczer says that figuring out how best to control misinformation is imperative for the health of science and society. By understanding how and why people are vulnerable to misinformation, scientists can build countermeasures for it.

“When people are manipulated, they listen to bad actors and that leads to bad policy,” explains Menczer. “As scientists, we need to be very concerned about the fact that when people are misinformed and they don't believe scientists, they end up supporting politicians that push policies that are proven by science to be dangerous.”

Menczer has had his eye on bad actors since the Internet’s early days, and it’s his natural aptitude for making sense of complex systems that has earned him his reputation as a leader in the study of web science and social networks. Menczer earned his undergraduate degree in physics from the Sapienza University of Rome. Driven by his love of artificial life and algorithms, he left Italy for San Diego, where he earned a master’s in computer science from the University of California San Diego and a subsequent doctorate in computer and cognitive science.

During his Ph.D., Menczer began working on adaptive web crawlers — computer programs that explore and index the web — that could learn an internet user’s interests to optimize their search results. In 1998, he joined the faculty at the University of Iowa as an assistant professor in the Department of Management Sciences, where he worked on data mining, AI, and machine learning. Menczer landed an associate professor role at Indiana University in 2003, the same year that social media platform MySpace launched and became the first social network to reach a global audience. At Indiana University’s School of Informatics, Menczer bore witness to the rise of platforms like Facebook, Twitter, and Instagram, as well as the good — and bad — that comes with them.

“Between 2009 and 2010 is when we started really focusing on social media and realized that we could collect a lot of data from it,” says Menczer. “We could see how people interacted with information and how things went viral, and around the same time, we started noticing that social media also had vulnerabilities.”

Menczer and his team began investigating how phishing attacks — in which scammers try to steal sensitive personal data — could exploit data harvested from social media by posing as a person’s Facebook friend in email messages. Their published study motivated Facebook to add privacy protections to its platform. That’s just one example from a long list of contributions that Menczer has made to the study of web science, social networks, and data mining. He has received a CAREER Award from the National Science Foundation and was named a Fellow of the Association for Computing Machinery (ACM) in 2020.

Ask Menczer what the solution is to quashing misinformation and increasing confidence in science, and you may feel a little disappointed by his answer.

“So far, there is no silver bullet,” he says. “However, there are lots of things that we think probably help and should become part of the solution.”

Teaching consumers how to distinguish fake news from real news — that is, how to be news literate — is a start. Games that teach news literacy can produce small but significant improvements in a person’s ability to separate fact from fiction online. Encouraging platforms to increase moderation is another piece of the puzzle. So is having meaningful conversations about the societal cost of producing information at such a massive scale.

At the end of the day, misinformation will always be around, says Menczer, but finding ways to create a healthier information ecosystem online can help lessen its negative impact on science and society.

“Technology unfortunately provides ways to amplify the spread of misinformation,” says Menczer. “Do we really want it to be so cheap to produce information at such a huge volume and reach? Or do we want to add friction, maybe in that you have to prove that you're a human before your message gets seen by more than 100 people? These are questions we need to ask.”
