
Understanding Coordinated and Inauthentic Link Sharing Behavior on Facebook in the Run-up to 2018 General Election and 2019 European Election in Italy

Giglietto, Fabio; Righetti, Nicola; Marino, Giada
2019-01-01

Abstract

The year 2016 marked a turning point in the history of relations between the Internet, social media, public opinion, and politics. Online practices of grassroots participation, once considered the prerogative of democratizing forces fighting established powers, turned out to be an effective platform for far-right extremists. In the attempt to make sense of what happened and develop workable solutions, stakeholders rapidly moved through the different stages of grief, ranging from denial to anger and acceptance. As a case in point, we trace the path through these stages, which moved from an initial denial of the problem, to concern over “fake news” and hoaxes, to finally focusing on the behavior of certain actors on the platform. Starting from the current phase, we analyze “coordinated inauthentic behavior”, a concept only briefly defined in Facebook's public statements which is nevertheless useful for framing future studies, insofar as both coordination and authenticity have been studied in the literature. Leveraging these works, we suggest a definition that is at once grounded in this literature and easy to operationalize. Using this definition and an unprecedented combination of the CrowdTangle API (a tool for accessing Facebook and other social media data) and two datasets of Italian political news stories published in the run-up to the 2018 Italian general election and the 2019 European election, we developed a method that identified several networks of pages/groups/verified public profiles (“entities”) that shared the same political news stories on Facebook within a very short period of time (10 networks in 2018, composed of 28 entities, and 50 in 2019, composed of 143 entities). We call this behavior “coordinated link sharing”.
By analyzing the social media profiles of these coordinated entities, we observed that while some of them were clearly political, others presented themselves as entertainment venues despite also sharing political content. Since the political news stories shared by these non-political entities can reach a broad audience that is largely unguarded against attempts to influence it, we describe their behavior as “inauthentic”. Further analyses showed that the news shared by the coordinated networks received higher Facebook engagement than other news in our dataset, that many of these stories boosted anti-immigration and far-right propaganda, and that several of the news outlets shared by these networks had already been blacklisted by fact-checkers.
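The detection method described above rests on a simple idea: two entities are candidate members of a coordinated network when they repeatedly share the same URL within a very short time window. The sketch below illustrates that core step under stated assumptions; the entity names, URLs, timestamps, and the 60-second threshold are all illustrative (the abstract only says "a very short period of time"), and real share data would come from the CrowdTangle API rather than a hand-written list.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical sample of shares: (entity, url, timestamp in seconds).
# In the actual study these records would be retrieved via the
# CrowdTangle API; everything here is illustrative.
shares = [
    ("PageA",  "http://example.com/story1", 0),
    ("PageB",  "http://example.com/story1", 15),
    ("GroupC", "http://example.com/story1", 20),
    ("PageD",  "http://example.com/story1", 3600),  # too late: not coordinated
    ("PageA",  "http://example.com/story2", 100),
    ("PageB",  "http://example.com/story2", 110),
]

THRESHOLD = 60  # seconds; an assumed value for "a very short period of time"

def coordinated_pairs(shares, threshold):
    """Count, per pair of entities, how many distinct URLs both shared
    within `threshold` seconds of each other."""
    by_url = defaultdict(list)
    for entity, url, ts in shares:
        by_url[url].append((entity, ts))

    counts = defaultdict(int)
    for url, posts in by_url.items():
        # Each URL contributes at most one co-share per entity pair.
        near = set()
        for (e1, t1), (e2, t2) in combinations(posts, 2):
            if e1 != e2 and abs(t1 - t2) <= threshold:
                near.add(tuple(sorted((e1, e2))))
        for pair in near:
            counts[pair] += 1
    return dict(counts)

print(coordinated_pairs(shares, THRESHOLD))
# PageA and PageB co-share two URLs quickly; PageD's late share is excluded.
```

Pairs that co-share many URLs within the window can then be treated as edges of a network, from which the coordinated components (the 10 networks in 2018 and 50 in 2019) would be extracted.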

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11576/2671453