
Science: Users Are Main Filters of Political Content on Facebook

Facebook users are exposed to more content that challenges their political beliefs than previously suspected, according to the new Science study. | Marcus Quigmire/CC BY-SA 2.0

Individuals are their own chief censors when it comes to the kind of political news they see on the social media site, according to a new study of 10 million Facebook users.

Individuals' choices regarding what they clicked on limited their exposure to diverse political ideas more than Facebook's methods for ranking posts in a user's News Feed.

However, despite the self-filtering of their News Feeds, Facebook users are still exposed to more content that challenges their beliefs than some have envisioned, researchers report in the 8 May issue of Science.

As people increasingly turn to social networks for news and civic information, questions have been raised about whether this practice leads to the creation of "echo chambers," in which people are exposed only to information from like-minded individuals.

"No studies to date have been able to provide such a comprehensive and fine-grained account of what kinds of information individuals are exposed to [on social media sites], and what they choose to consume," said study co-author Eytan Bakshy, a data scientist on the Core Science Data team at Facebook.

"Our study shows that both liberals and conservatives have a substantial percentage of friends who claim an ideology from across the political spectrum, and that both groups are exposed to cross-cutting content," Bakshy continued.

The researchers studied Facebook users in the United States who publicly listed their political preferences. On average, they found, more than 20% of an individual's Facebook friends who self-identified with a political party were from the opposing party, providing room for exposure to opposite points of view.

When a user opens Facebook, they see a list of recent posts by friends — though not all posts. Facebook's News Feed ranks this content for its users using a special formula, or algorithm, designed to predict their preferences. The algorithm tries to anticipate content that users will like based on criteria such as how often users have clicked on links to certain websites in the News Feed in the past.
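As a rough illustration of the kind of criterion described above, the sketch below ranks a hypothetical feed by how often the user has previously clicked links from each post's source. It is a toy Python model, not Facebook's actual formula; every function and field name here is invented for the example.

# Toy sketch only -- not Facebook's actual ranking formula. It orders candidate
# posts by the fraction of the user's past clicks that went to each post's source.

from collections import Counter

def rank_feed(candidate_posts, past_click_sources):
    """Return posts ordered by the user's historical click rate on their sources.

    candidate_posts:    list of dicts like {"id": 1, "source": "SiteA"}
    past_click_sources: list of source names the user clicked on previously
    """
    click_counts = Counter(past_click_sources)
    total_clicks = sum(click_counts.values()) or 1  # avoid division by zero

    def predicted_interest(post):
        # Share of the user's past clicks that went to this post's source.
        return click_counts[post["source"]] / total_clicks

    return sorted(candidate_posts, key=predicted_interest, reverse=True)

# A user who mostly clicks SiteA sees SiteA's post ranked first.
posts = [{"id": 1, "source": "SiteA"}, {"id": 2, "source": "SiteB"}]
history = ["SiteA", "SiteA", "SiteB"]
print([p["id"] for p in rank_feed(posts, history)])  # -> [1, 2]

In a scheme like this, the more a user's past clicks lean toward ideologically aligned sources, the more the ranked feed leans the same way, which is the feedback the study set out to measure.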


The researchers considered the political news that users posted online for friends, noting whether it was liberal or conservative. They also analyzed the content that users ultimately clicked on and consumed.

"One thing that surprised us, in light of the social norms against talking about politics in polite conversation, was just how much cross-cutting content is shared," Bakshy said.

The researchers found, however, that liberals tended to have fewer friends who shared conservative content than conservatives had friends who shared liberal content.

Liberals were also 6% less likely to click on cross-cutting content that appeared in their News Feed (as opposed to content aligned with their beliefs), while conservatives were 17% less likely to do so.

The researchers found that Facebook's News Feed ranking algorithm did ideologically filter what users saw. But, on average, it produced just a one percentage point change in the proportion of news in a feed that challenged users' beliefs, while individuals' own choices of what to click on resulted in a four percentage point decrease in the proportion of challenging content.
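Read as successive stages of exposure, those figures stack roughly as in the short calculation below. The 30% starting share of cross-cutting news among friends' posts is a hypothetical placeholder chosen for illustration; only the one-point and four-point reductions come from the study.

# Illustrative arithmetic only: a hypothetical 30% starting share of
# cross-cutting news among friends' posts, reduced by the reported
# ~1 percentage point (algorithmic ranking) and ~4 percentage points (own clicks).

shared_by_friends = 0.30                  # hypothetical starting share
shown_in_feed = shared_by_friends - 0.01  # ranking algorithm: ~1 point lower
actually_clicked = shown_in_feed - 0.04   # individual choice: ~4 points lower

print(f"Shared by friends:  {shared_by_friends:.0%}")   # 30%
print(f"Shown in News Feed: {shown_in_feed:.0%}")       # 29%
print(f"Actually clicked:   {actually_clicked:.0%}")    # 25%

The point of the comparison is the relative size of the two steps: the drop attributable to users' own clicking is roughly four times the drop attributable to the ranking algorithm.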

The results suggest that users themselves hold the key to challenging their own views, and that Facebook probably exposes individuals to more ideology-challenging content than most blogs and news wires do.

However, as Facebook's ranking algorithm does have a modest filtering effect, its curation methods should still be regularly assessed, suggests David Lazer, a professor of political science and computer and information science at Northeastern University, in a related Perspective article.

"This is an important finding and one that requires continued vigilance," writes Lazer. "A small effect today might become a large effect tomorrow, depending on changes in the algorithms and human behavior."