Since the relatively early days of the Internet era, there have been concerns that people would self-select what kind of news content to consume, and then repeatedly reinforce their own beliefs.
Now researchers at Facebook have published a peer-reviewed study in the journal Science finding that Facebook's own algorithm isn't the only reason, or even the biggest one, that people on the platform consume information they already agree with.
On average, about 23 percent of users’ friends have an opposing political affiliation, according to the study. An average of nearly 29 percent of the news stories displayed by Facebook’s News Feed also appear to present views that conflict with the user’s own ideology.
The researchers say that individuals’ choices about which stories to click have a larger effect than Facebook’s filtering mechanism in determining whether people encounter news that conflicts with their professed ideology.
With more than 1.3 billion users, Facebook is effectively the world’s most widely read daily newspaper. About 30 percent of U.S. adults get their news from Facebook, according to the Pew Research Center.
But Facebook's editorial decisions are made with little transparency, through the News Feed algorithm. Facebook could point to the study's results as evidence that the algorithm isn't ruining national discourse.