Coordinated Link Sharing Behavior as a Signal to Surface Sources of Problematic Information on Facebook

Giglietto, Fabio; Righetti, Nicola; Marino, Giada
2020

Abstract

Despite widespread concern over the role played by disinformation during recent electoral processes, the intrinsic elusiveness of the subject hinders efforts aimed at estimating its prevalence and effect. While there has been a proliferation of attempts to define, understand, and fight the spread of problematic information in contemporary media ecosystems, most of these attempts focus on detecting false content and/or bad actors. For instance, several existing studies rely on lists of problematic content or news media sources compiled by fact-checkers. However, these lists may quickly become obsolete, leading to unreliable estimates. Using media manipulation as a frame, along with a revised version of the original definition of “coordinated inauthentic behavior,” we argue in this paper for a wider ecological focus. Leveraging a method designed to detect “coordinated link sharing behavior” (CLSB), we introduce and assess an approach aimed at creating and keeping updated lists of potentially problematic sources by analyzing the URLs shared on Facebook by public groups, pages, and verified profiles. We show that CLSB is consistently associated with higher risks of encountering problematic news sources across three different datasets of news stories and can thus be used as a signal to support manual and automatic detection of problematic information.
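As a rough illustration of the intuition behind CLSB detection, the sketch below flags pairs of Facebook entities (pages, groups, verified profiles) that repeatedly share the same URL within a short time window. This is not the paper's implementation: the field names, the fixed 30-second window, and the repetition threshold are assumptions made for this example (in the actual approach, the coordination interval is estimated empirically from the distribution of sharing times rather than fixed in advance).

```python
# Illustrative sketch only (not the paper's implementation) of the core idea
# behind "coordinated link sharing behavior" (CLSB): entities that repeatedly
# share the same URL within a very short time interval are treated as
# potentially coordinated. Window and threshold values are assumptions.

from collections import defaultdict
from itertools import combinations
from datetime import datetime

# Each share record: (url, entity_id, timestamp) — hypothetical sample data.
shares = [
    ("http://example-news.site/story1", "page_A", datetime(2019, 5, 1, 10, 0, 5)),
    ("http://example-news.site/story1", "page_B", datetime(2019, 5, 1, 10, 0, 20)),
    ("http://example-news.site/story2", "page_A", datetime(2019, 5, 2, 9, 30, 0)),
    ("http://example-news.site/story2", "page_B", datetime(2019, 5, 2, 9, 30, 10)),
]

TIME_WINDOW_SECONDS = 30   # assumed coordination interval
MIN_CO_SHARES = 2          # assumed minimum number of repeated co-shares

def coordinated_pairs(shares, window=TIME_WINDOW_SECONDS, min_co_shares=MIN_CO_SHARES):
    """Return entity pairs that co-shared the same URL within `window` seconds
    at least `min_co_shares` times."""
    # Group share events by URL.
    by_url = defaultdict(list)
    for url, entity, ts in shares:
        by_url[url].append((entity, ts))

    # Count, for each pair of distinct entities, how often they shared the
    # same URL within the time window.
    pair_counts = defaultdict(int)
    for url, events in by_url.items():
        events.sort(key=lambda e: e[1])
        for (e1, t1), (e2, t2) in combinations(events, 2):
            if e1 != e2 and abs((t2 - t1).total_seconds()) <= window:
                pair_counts[tuple(sorted((e1, e2)))] += 1

    return {pair for pair, n in pair_counts.items() if n >= min_co_shares}

print(coordinated_pairs(shares))  # {('page_A', 'page_B')}
```

In this toy example, page_A and page_B co-share two different URLs within 30 seconds of each other, so the pair is flagged; the domains they repeatedly co-share could then feed a continuously updated list of potentially problematic sources, as described in the abstract.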
Use this identifier to cite or link to this record: https://hdl.handle.net/11576/2677666
Citations
  • Scopus: 16
  • Web of Science (ISI): 10