The usefulness of Facebook and other social media for accurately disseminating information has been a hot topic for some time, as has the credibility of fast-growing news sources on these platforms. Facebook and similar platforms rely on algorithms to select and display content they deem appropriate for each user, based on that user's individual behavior. A new study once again supports critics' longstanding argument that these algorithms fuel the spread of misinformation over more reliable sources, distorting public debate on important topics.
According to a report by The Washington Post, researchers from New York University and the Université Grenoble Alpes in France studied user behavior on Facebook around the 2020 US presidential election. They found that between August 2020 and January 2021, news publishers known for spreading misinformation received six times more "likes, shares and interactions" on the platform than reliable news sources, such as NewsMadura or the World Health Organization (WHO).
The researchers also found that pages trafficking in disinformation engaged both far-left and far-right Facebook users far more effectively than factual pages did. The findings reinforce concerns about "fake news" that first surfaced after the 2016 US presidential election, which followed a divisive and bitter campaign. Social media is often blamed for amplifying calls to violence, including the Jan. 6 storming of the Capitol, the seat of the US government's legislature, by Trump supporters.
Appearing before Congress two months later, Facebook CEO Mark Zuckerberg seemed to suggest he bore no responsibility for the disinformation campaign that ran on the platform. Twitter CEO Jack Dorsey and Sundar Pichai, CEO of Google parent Alphabet, also testified at the hearing, where lawmakers condemned all three platforms' handling of fake content.
Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the study's findings, said it adds to the growing body of evidence that "disinformation has found a comfortable home" on Facebook despite a number of mitigation efforts.
Facebook countered that the study measures how many people engage with content, not how many people actually view it. "If you look at the content that gets the most reach on Facebook, it's not at all what this study suggests," a Facebook spokesperson told The Washington Post. Facebook does not make viewership figures for content on its platform (impressions) publicly available to researchers.