#FacebookHarmsUs

Facebook promotes lies

Lies and conspiracy theories are everywhere on Facebook. From Covid-19 to the 2020 U.S. presidential election, mis- and disinformation fill what users see when they log on to the platform.

Facebook Promotes Lies and Conspiracies

In August 2021, a former high-level employee leaked thousands of pages of Facebook documents and internal research to journalists around the world. These damaging records and related research revealed:

  • Lies and conspiracy theories are everywhere on Facebook. From Covid-19 to the 2020 U.S. presidential election, mis- and disinformation populate what users see when they log on to the platform.
  • The company’s algorithms promote this type of false and polarizing content. Facebook’s suggestions and newsfeed rankings increase the visibility of content most likely to receive engagement–even though internal research shows such content is more likely to be false and polarizing.
  • Troll farms on Facebook are common and have extremely large audiences. These problematic accounts publish intentionally inflammatory or false content to manipulate public opinion, and Facebook’s tools and systems let them easily expand their reach well beyond the users who follow them.

Lies and conspiracy theories are everywhere on Facebook

The Covid-19 pandemic highlights Facebook’s algorithmic problem with mis- and disinformation. According to an internal Facebook memo, initial testing concluded that roughly 41% of comments on English-language vaccine-related posts risked discouraging people from getting vaccinated. Facebook employees also warned that coronavirus misinformation was creating “echo-chamber-like effects”–while other researchers documented how posts by medical authorities, like the World Health Organization, were hijacked by anti-vaccine commenters.

The 2020 U.S. presidential election further illustrates Facebook’s shortcomings on the issue. In an internal report written just after the election, a Facebook data scientist told his co-workers that 10 percent of all U.S. views of political material were of posts alleging the vote was fraudulent. Within two days of the vote, another company employee noted that comments containing “combustible election misinformation” could be read below many posts. The danger of such conspiracy theories spreading swiftly across the platform had been apparent sixteen months before the election itself, when a Facebook researcher described how an experimental account she opened began receiving QAnon conspiracy content within a week.

Facebook’s algorithms promote false and polarizing content

Lies and conspiracy theories dominate on Facebook because of the platform’s underlying algorithms. Facebook’s own research demonstrates how these tools foster disinformation and discord: the company’s suggestions and newsfeed rankings actively promote the content most likely to receive engagement. Even though this type of content is more likely to be false or polarizing, Facebook pushes such inflammatory posts ever higher in users’ newsfeeds to drive engagement, while also surfacing them to users who have little or no connection to the account that originally published them.

Facebook’s employees have consistently raised concerns about misinformation and inflammatory content on the platform and demanded action, yet the company has failed to adequately address these issues. A report, written in October 2019 and obtained by MIT Technology Review from a former Facebook employee, found that prior to the 2016 U.S. election, Facebook failed to prioritize fundamental changes to how its platform promotes and distributes information. The company decided instead to “monitor” the activity of bad actors when they engaged in political discourse, and added some guidelines to prevent “the worst of the worst.”

These algorithms incentivize troll farms to expand

Troll farms thrive on Facebook because the platform promotes engaging (read: inflammatory) content and refuses to penalize pages for posting wholly unoriginal content. Due to these algorithmic underpinnings, troll farms–whose strategy is to constantly repost engaging content to manipulate public opinion–consistently see their reach expand to new audiences. According to one report, some of Facebook’s most popular pages were being run by Eastern European troll farms in the run-up to the 2020 U.S. election. Troublingly, these pages were part of a larger network that reached nearly half of all Americans. Another report showed that troll farms were reaching 140 million U.S. users per month–75% of whom had never followed any of the pages.

Updates

January 4, 2022, Washington Post: Facebook groups swelled with at least 650,000 posts attacking the legitimacy of Joe Biden’s victory.

Resources

REPORT: The Disinformation Dozen: Why platforms must act on twelve leading online anti-vaxxers, Center for Countering Digital Hate, Summer 2021 https://www.counterhate.com/disinformationdozen

REPORT: Facebook’s Algorithm: A Major Threat to Public Health, Avaaz, April 19, 2021 https://avaazimages.avaaz.org/facebook_threat_health.pdf

REPORT SERIES: Covid-19 Disinformation Briefings, Institute for Strategic Dialogue, March 27, 2020

A series of briefings from ISD’s Digital Research Unit on the information ecosystem around Covid-19. This first briefing compiles research from ISD’s own analysis of online platforms, as well as summarising recent investigations and research on the state of play of disinformation around Covid-19.

No. 1 https://www.isdglobal.org/wp-content/uploads/2020/03/COVID-19-Briefing-Institute-for-Strategic-Dialogue-27th-March-2020.pdf

No. 2 https://www.isdglobal.org/isd-publications/covid-19-disinformation-briefing-no-2/

No. 3 https://www.isdglobal.org/isd-publications/covid-19-disinformation-briefing-no-3/

No. 4 https://www.isdglobal.org/isd-publications/covid-19-disinformation-briefing-no-4/

MENA Monitor https://www.isdglobal.org/isd-publications/mena-monitor-arabic-covid-19-vaccine-misinformation-online/
