TY - JOUR
T1 - The Diffusion and Reach of (Mis)Information on Facebook During the U.S. 2020 Election
AU - González-Bailón, Sandra
AU - Lazer, David
AU - Barberá, Pablo
AU - Godel, William
AU - Allcott, Hunt
AU - Brown, Taylor
AU - Crespo-Tenorio, Adriana
AU - Freelon, Deen
AU - Gentzkow, Matthew
AU - Guess, Andrew M.
AU - Iyengar, Shanto
AU - Kim, Young Mie
AU - Malhotra, Neil
AU - Moehler, Devra
AU - Nyhan, Brendan
AU - Pan, Jennifer
AU - Velasco Rivera, Carlos
AU - Settle, Jaime
AU - Thorson, Emily
AU - Tromble, Rebekah
AU - Wilkins, Arjun
AU - Wojcieszak, Magdalena
AU - Kiewiet de Jonge, Chad
AU - Franco, Annie
AU - Mason, Winter
AU - Stroud, Natalie Jomini
AU - Tucker, Joshua A.
N1 - Publisher Copyright:
© 2024 The Author(s). This open-access article has been published under a Creative Commons Attribution License, which allows unrestricted use, distribution and reproduction, in any form, as long as the original author and source have been credited.
PY - 2024
Y1 - 2024
N2 - Social media creates the possibility for rapid, viral spread of content, but how many posts actually reach millions? And is misinformation special in how it propagates? We answer these questions by analyzing the virality of and exposure to information on Facebook during the U.S. 2020 presidential election. We examine the diffusion trees of the approximately 1 billion posts that were re-shared at least once by U.S.-based adults from July 1, 2020, to February 1, 2021. We differentiate misinformation from non-misinformation posts to show that (1) misinformation diffused more slowly, relying on a small number of active users who spread misinformation via long chains of peer-to-peer diffusion that reached millions; non-misinformation spread primarily through one-to-many affordances (mainly, Pages); (2) the relative importance of peer-to-peer spread for misinformation was likely due to an enforcement gap in content moderation policies designed to target mostly Pages and Groups; and (3) periods of aggressive content moderation proximate to the election coincide with dramatic drops in the spread and reach of misinformation and (to a lesser extent) political content.
AB - Social media creates the possibility for rapid, viral spread of content, but how many posts actually reach millions? And is misinformation special in how it propagates? We answer these questions by analyzing the virality of and exposure to information on Facebook during the U.S. 2020 presidential election. We examine the diffusion trees of the approximately 1 billion posts that were re-shared at least once by U.S.-based adults from July 1, 2020, to February 1, 2021. We differentiate misinformation from non-misinformation posts to show that (1) misinformation diffused more slowly, relying on a small number of active users who spread misinformation via long chains of peer-to-peer diffusion that reached millions; non-misinformation spread primarily through one-to-many affordances (mainly, Pages); (2) the relative importance of peer-to-peer spread for misinformation was likely due to an enforcement gap in content moderation policies designed to target mostly Pages and Groups; and (3) periods of aggressive content moderation proximate to the election coincide with dramatic drops in the spread and reach of misinformation and (to a lesser extent) political content.
KW - content moderation
KW - elections
KW - misinformation
KW - networks
KW - social media
UR - http://www.scopus.com/inward/record.url?scp=85213958624&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85213958624&partnerID=8YFLogxK
U2 - 10.15195/V11.A41
DO - 10.15195/V11.A41
M3 - Article
AN - SCOPUS:85213958624
SN - 2330-6696
VL - 11
SP - 1124
EP - 1146
JO - Sociological Science
JF - Sociological Science
ER -