Alessandro Bessi
University of Southern California
Publications
Featured research published by Alessandro Bessi.
Proceedings of the National Academy of Sciences of the United States of America | 2016
Michela Del Vicario; Alessandro Bessi; Fabiana Zollo; Fabio Petroni; Antonio Scala; Guido Caldarelli; H. Eugene Stanley; Walter Quattrociocchi
Significance: The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web is also a fruitful environment for the massive diffusion of unverified rumors. In this work, using a massive quantitative analysis of Facebook, we show that information related to distinct narratives (conspiracy theories and scientific news) generates homogeneous and polarized communities, i.e., echo chambers, with similar information consumption patterns. We then derive a data-driven percolation model of rumor spreading which demonstrates that homogeneity and polarization are the main determinants for predicting cascade size.

Abstract: The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories, which often elicit rapid, large, but naive social responses, such as the recent case of Jade Helm 15, in which a simple military exercise came to be perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, their cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., "echo chambers." Indeed, homogeneity appears to be the primary driver of content diffusion, and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and show that homogeneity and polarization are the main determinants for predicting cascade size.
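The abstract gives no equations for the percolation model, so the Python sketch below is only a rough illustration of the general idea: a percolation-style cascade on a random graph in which a rumor spreads along an edge only when the neighbor's polarization is close to the seed's. The graph, the homogeneity threshold, and the share probability are hypothetical choices, not the paper's specification.

```python
# Minimal, illustrative percolation-style cascade (not the paper's exact model).
# Assumption: each user has a polarization in [0, 1]; a rumor spreads along an
# edge only if the neighbor's polarization is within `homogeneity` of the seed's.
import random
import networkx as nx

def simulate_cascade(graph, seed, homogeneity=0.2, share_prob=0.5):
    """Return the set of users reached by a rumor started at `seed`."""
    reached = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for neighbor in graph.neighbors(node):
            if neighbor in reached:
                continue
            similar = abs(graph.nodes[neighbor]["pol"] - graph.nodes[seed]["pol"]) <= homogeneity
            if similar and random.random() < share_prob:
                reached.add(neighbor)
                frontier.append(neighbor)
    return reached

random.seed(0)
g = nx.erdos_renyi_graph(n=1000, p=0.01, seed=0)
nx.set_node_attributes(g, {n: random.random() for n in g}, "pol")
sizes = [len(simulate_cascade(g, seed=random.randrange(1000))) for _ in range(100)]
print("mean cascade size:", sum(sizes) / len(sizes))
```

Raising the homogeneity threshold in this toy model lets cascades cross between dissimilar users, which is the kind of dependence on homogeneity and polarization the abstract describes.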
PLOS ONE | 2015
Alessandro Bessi; Mauro Coletto; George Alexandru Davidescu; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
The wide availability of user-provided content on online social media facilitates the aggregation of people around shared beliefs, interests, worldviews, and narratives. In spite of the enthusiastic rhetoric about so-called collective intelligence, unsubstantiated rumors and conspiracy theories (e.g., chemtrails, reptilians, or the Illuminati) are pervasive in online social networks (OSNs). In this work we study, on a sample of 1.2 million individuals, how information related to very distinct narratives, i.e., mainstream scientific and conspiracy news, is consumed and shapes communities on Facebook. Our results show that polarized communities emerge around distinct types of content, and that habitual consumers of conspiracy news turn out to be more focused on, and self-contained within, their specific content. To test potential biases induced by continued exposure to unsubstantiated rumors on users' content selection, we conclude our analysis by measuring how users respond to 4,709 troll posts, i.e., parodic and sarcastic imitations of conspiracy theories. We find that 77.92% of likes and 80.86% of comments on such posts come from users who usually interact with conspiracy stories.
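The final measurement lends itself to a simple back-of-the-envelope reconstruction. The sketch below labels a user as conspiracy-polarized when most of their likes fall on conspiracy pages and then computes the share of troll-post likes coming from that group; the input file, column names, and the 0.95 threshold are assumptions for illustration, not the paper's actual pipeline.

```python
# Hypothetical schema: one row per like, with the user id, the page category
# ("science", "conspiracy", or "troll"), and the post id.
import pandas as pd

likes = pd.read_csv("likes.csv")  # columns assumed: user_id, category, post_id

# Polarization: fraction of a user's science/conspiracy likes that go to conspiracy pages.
core = likes[likes["category"].isin(["science", "conspiracy"])]
conspiracy_share = (
    core.assign(is_consp=core["category"].eq("conspiracy"))
        .groupby("user_id")["is_consp"].mean()
)
polarized_conspiracy = set(conspiracy_share[conspiracy_share > 0.95].index)

# Share of likes on troll posts that come from conspiracy-polarized users.
troll_likes = likes[likes["category"].eq("troll")]
share = troll_likes["user_id"].isin(polarized_conspiracy).mean()
print(f"{share:.2%} of troll-post likes come from conspiracy-polarized users")
```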
PLOS ONE | 2015
Fabiana Zollo; Petra Kralj Novak; Michela Del Vicario; Alessandro Bessi; Igor Mozetič; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
According to the World Economic Forum, the diffusion of unsubstantiated rumors on online social media is one of the main threats to our society. The disintermediated paradigm of content production and consumption on online social media might foster the formation of homogeneous communities (echo chambers) around specific worldviews. Such a scenario has been shown to be a fertile environment for the diffusion of false claims. Not rarely, viral phenomena trigger naive (and sometimes comical) social responses, e.g., the recent case of Jade Helm 15, in which a simple military exercise came to be perceived as the beginning of a civil war in the US. In this work, we address the emotional dynamics of collective debates around distinct kinds of information, i.e., science and conspiracy news, both inside and across their respective polarized communities. We find that, for both kinds of content, the longer the discussion, the more negative the sentiment. We show that comments on conspiracy posts tend to be more negative than those on science posts. Moreover, the more engaged users are, the more negatively they tend to comment (on both science and conspiracy). Finally, zooming in on the interaction between polarized communities, we find a generally negative pattern: as the number of comments increases, i.e., as the discussion becomes longer, the sentiment of the post becomes more and more negative.
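One hedged way to inspect the "longer discussion, more negative sentiment" trend is to bin posts by their number of comments and average a per-comment sentiment score within each bin, as in the sketch below; the input schema and the [-1, 1] sentiment scale are assumptions, not the paper's sentiment classifier.

```python
# Hypothetical input: one row per comment with the post id and a sentiment
# score in [-1, 1] (negative to positive).
import pandas as pd

comments = pd.read_csv("comments.csv")  # columns assumed: post_id, sentiment

per_post = comments.groupby("post_id").agg(
    n_comments=("sentiment", "size"),
    mean_sentiment=("sentiment", "mean"),
)

# Bin posts by discussion length and look at how average sentiment shifts.
bins = [1, 10, 100, 1000, float("inf")]
per_post["length_bin"] = pd.cut(per_post["n_comments"], bins=bins, right=False)
print(per_post.groupby("length_bin", observed=True)["mean_sentiment"].mean())
```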
PLOS ONE | 2017
Fabiana Zollo; Alessandro Bessi; Michela Del Vicario; Antonio Scala; Guido Caldarelli; Louis M. Shekhtman; Shlomo Havlin; Walter Quattrociocchi
Social media aggregate people around common interests, eliciting collective framing of narratives and worldviews. However, in such a disintermediated environment misinformation is pervasive, and debunking attempts are often undertaken to counter this trend. In this work, we examine the effectiveness of debunking on Facebook through a quantitative analysis of 54 million users over a time span of five years (January 2010 to December 2014). In particular, we compare how users who usually consume proven (scientific) and unsubstantiated (conspiracy-like) information on US Facebook interact with specific debunking posts. Our findings confirm the existence of echo chambers in which users interact primarily with either conspiracy-like or scientific pages; both groups, however, interact similarly with the information within their echo chamber. We then measure how users from both echo chambers interacted with 50,220 debunking posts, accounting for both users' consumption patterns and the sentiment expressed in their comments. Sentiment analysis reveals a dominant negativity in the comments on debunking posts. Furthermore, such posts remain mainly confined to the scientific echo chamber. Only a few conspiracy users engage with corrections, and their liking and commenting rates on conspiracy posts increase after the interaction.
Journal of Trust Management | 2014
Alessandro Bessi; Antonio Scala; Luca Rossi; Qian Zhang; Walter Quattrociocchi
In this work we present a thorough quantitative analysis of the consumption patterns of qualitatively different information on Facebook. Pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics that are neglected by science and mainstream media); b) online political activism; and c) mainstream media. We find similar information consumption patterns despite the very different nature of the content. We then classify users according to their interaction patterns across the different topics and measure how they responded to the injection of 2,788 false news items (parodic imitations of alternative stories). We find that users who prominently interact with alternative information sources, i.e., who are more exposed to unsubstantiated claims, are more prone to interact with intentionally false and parodic claims.
PLOS ONE | 2015
Alessandro Bessi; Fabiana Zollo; Michela Del Vicario; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
Social media have enabled a direct path from content producer to consumer, changing the way users get informed, debate, and shape their worldviews. Such disintermediation might weaken consensus on socially relevant issues in favor of rumors, mistrust, or conspiracy thinking, e.g., chemtrails inducing global warming, the link between vaccines and autism, or the New World Order conspiracy. Previous studies pointed out that consumers of conspiracy-like content are likely to aggregate in homophilous clusters, i.e., echo chambers. Along this path, we study, by means of a thorough quantitative analysis, how different topics are consumed inside the conspiracy echo chamber on the Italian Facebook. Through a semi-automatic topic extraction strategy, we show that the most consumed content semantically refers to four specific categories: environment, diet, health, and geopolitics. We find similar consumption patterns when comparing users' activity (likes and comments) on posts belonging to these different semantic categories. Finally, we model users' mobility across the distinct topics and find that the more active a user is, the more likely they are to span all categories: once inside a conspiracy narrative, users tend to embrace the overall corpus.
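The relation between activity and topic span can be illustrated with a short, hypothetical aggregation: count each user's interactions and the number of the four categories they touch, then compare activity across span levels. The file and column names below are assumptions for illustration, not the paper's data.

```python
# Hypothetical input: one row per interaction (like or comment), with the user id
# and the semantic category of the post ("environment", "diet", "health", "geopolitics").
import pandas as pd

acts = pd.read_csv("interactions.csv")  # columns assumed: user_id, category

per_user = acts.groupby("user_id").agg(
    activity=("category", "size"),         # total likes + comments
    n_categories=("category", "nunique"),  # how many of the four topics they span
)

# Expectation under the abstract's claim: median activity grows with topic span.
print(per_user.groupby("n_categories")["activity"].median())
```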
Social Informatics | 2014
Alessandro Bessi; Guido Caldarelli; Michela Del Vicario; Antonio Scala; Walter Quattrociocchi
Despite the enthusiastic rhetoric about so-called collective intelligence, conspiracy theories, e.g., global warming induced by chemtrails or the link between vaccines and autism, find on the Web a natural medium for their dissemination. Users preferentially consume information according to their system of beliefs, and the friction between users with opposing worldviews (e.g., scientific and conspiracist) may result in heated debates. In this work we provide a genuine example of information consumption on a set of 1.2 million Italian Facebook users. We show, by means of a thorough quantitative analysis, that information supporting different worldviews, i.e., scientific and conspiracist news, is consumed in a comparable way. Moreover, we measure the effect of 4,709 evidently false news items (satirical versions of conspiracist stories) and 4,502 debunking memes (content aimed at countering unsubstantiated rumors) on users polarized toward conspiracy claims.
PLOS ONE | 2016
Alessandro Bessi; Fabiana Zollo; Michela Del Vicario; Michelangelo Puliga; Antonio Scala; Guido Caldarelli; Brian Uzzi; Walter Quattrociocchi
Users online tend to select information that supports and adheres to their beliefs, and to form polarized groups sharing the same view, e.g., echo chambers. Algorithms for content promotion may favor this phenomenon by accounting for users' preferences and thus limiting exposure to unsolicited content. To shed light on this question, we perform a comparative study of how the same content (videos) is consumed on different online social media, i.e., Facebook and YouTube, over a sample of 12 million users. Our findings show that content drives the emergence of echo chambers on both platforms. Moreover, we show that users' commenting patterns are accurate predictors of the formation of echo chambers.
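The abstract does not detail the prediction setup, so the following is only a plausible sketch: a logistic regression that predicts a user's echo chamber from a few hypothetical commenting features. The file name, feature columns, and model choice are assumptions, not the paper's method.

```python
# Hypothetical per-user features derived from commenting activity; the target is
# which echo chamber (0 = science, 1 = conspiracy) the user ends up in.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

users = pd.read_csv("user_features.csv")
# columns assumed: n_comments, mean_comment_length, n_distinct_videos, echo_chamber
X = users[["n_comments", "mean_comment_length", "n_distinct_videos"]]
y = users["echo_chamber"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```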
Scientific Reports | 2016
Michela Del Vicario; Gianna Vivaldo; Alessandro Bessi; Fabiana Zollo; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
Recent findings showed that users on Facebook tend to select information that adheres to their system of beliefs and to form polarized groups, i.e., echo chambers. Such a tendency dominates information cascades and might affect public debates on socially relevant issues. In this work we explore the structural evolution of communities of interest by accounting for users' emotions and engagement. Focusing on Facebook pages reporting scientific and conspiracy content, we characterize the evolution of the size of the two communities by fitting daily-resolution data with three growth models: the Gompertz model, the Logistic model, and the Log-logistic model. Although all the models describe the data structure appropriately, the Logistic one shows the best fit. We then explore the interplay between users' emotional state and engagement in the group dynamics. Our findings show that a community's emotional behavior is affected by the users' involvement inside the echo chamber: higher involvement corresponds to a more negative approach. Moreover, we observe that, on average, more active users shift toward negativity faster than less active ones.
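The growth models named here are standard, so a minimal fitting sketch is easy to give. Below, Logistic and Gompertz curves are fitted to assumed daily community-size data with scipy's curve_fit; the data file, initial guesses, and residual-sum-of-squares comparison are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative fit of two growth models to daily community-size data.
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: K / (1 + exp(-r * (t - t0)))."""
    return K / (1.0 + np.exp(-r * (t - t0)))

def gompertz(t, K, b, c):
    """Gompertz growth: K * exp(-b * exp(-c * t))."""
    return K * np.exp(-b * np.exp(-c * t))

sizes = pd.read_csv("community_size.csv")  # columns assumed: day, size
t, y = sizes["day"].to_numpy(float), sizes["size"].to_numpy(float)

for name, model, p0 in [("logistic", logistic, (y.max(), 0.1, t.mean())),
                        ("gompertz", gompertz, (y.max(), 5.0, 0.05))]:
    params, _ = curve_fit(model, t, y, p0=p0, maxfev=10000)
    rss = float(np.sum((y - model(t, *params)) ** 2))
    print(name, "params:", params, "RSS:", rss)
```

Comparing the residual sum of squares (or an information criterion) across the fitted models is one simple way to reach a "Logistic fits best" style conclusion.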
Information: An International Interdisciplinary Journal | 2018
Anna Sapienza; Alessandro Bessi; Emilio Ferrara
Multiplayer online battle arena (MOBA) is a genre of online games that has become extremely popular. Due to their success, these games have also drawn the attention of the research community, because they provide a wealth of information about human online interactions and behaviors. A crucial problem is the extraction, in an interpretable way, of the activity patterns that characterize this type of data. Here, we leverage non-negative tensor factorization to detect hidden correlated playing behaviors in a well-known game: League of Legends. To this aim, we collect the entire gaming history of a group of about 1,000 players, which accounts for roughly 100,000 matches. By applying our framework, we are able to separate players into different groups. We show that each group exhibits similar features and playing strategies, as well as similar temporal trajectories, i.e., behavioral progressions over the course of their gaming history. Surprisingly, we discover that playing strategies are stable over time, and we provide an explanation for this observation.
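A minimal sketch of the factorization step, assuming a hypothetical players x features x time tensor and the tensorly implementation of non-negative PARAFAC, could look like this; the tensor shape, rank, and grouping rule are illustrative, not the paper's exact setup.

```python
# Sketch of detecting correlated play-behavior patterns with non-negative tensor
# factorization (NTF) on a hypothetical players x behavioral-features x matches tensor.
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

rng = np.random.default_rng(0)
X = rng.random((1000, 20, 100))  # placeholder data standing in for real gaming histories

weights, factors = non_negative_parafac(tl.tensor(X), rank=4, n_iter_max=200, random_state=0)
player_f, feature_f, time_f = factors  # non-negative factor matrices per tensor mode

# Assign each player to the latent behavioral component where they load most strongly.
groups = np.argmax(tl.to_numpy(player_f), axis=1)
print("players per group:", np.bincount(groups))
```

The feature and time factor matrices can then be inspected to interpret each component as a playing style and its trajectory over matches, which is the kind of interpretable pattern extraction the abstract describes.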