Michela Del Vicario
IMT Institute for Advanced Studies Lucca
Publications
Featured research published by Michela Del Vicario.
Proceedings of the National Academy of Sciences of the United States of America | 2016
Michela Del Vicario; Alessandro Bessi; Fabiana Zollo; Fabio Petroni; Antonio Scala; Guido Caldarelli; H. Eugene Stanley; Walter Quattrociocchi
Significance: The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web is also a fruitful environment for the massive diffusion of unverified rumors. In this work, using a massive quantitative analysis of Facebook, we show that information related to distinct narratives (conspiracy theories and scientific news) generates homogeneous and polarized communities (i.e., echo chambers) with similar information consumption patterns. We then derive a data-driven percolation model of rumor spreading that demonstrates that homogeneity and polarization are the main determinants for predicting cascade size.

The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories that often elicit rapid, large, but naive social responses, such as the recent case of Jade Helm 15, where a simple military exercise was perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., "echo chambers." Indeed, homogeneity appears to be the primary driver of content diffusion, and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and show that homogeneity and polarization are the main determinants for predicting cascade size.
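The percolation model in the paper is calibrated on Facebook data. Purely as an illustration of the core idea that homogeneity gates spreading, here is a minimal sketch of a cascade on a random graph in which a rumor crosses an edge only when the two users' opinions are sufficiently similar; the graph model, threshold, and similarity rule are assumptions for the example, not the authors' specification.

```python
# Illustrative sketch (not the authors' exact model): a rumor cascade on a
# random graph where a post spreads across an edge only if the two users'
# opinions are close enough (homogeneity-driven percolation).
import random
import networkx as nx

def simulate_cascade(n=1000, avg_degree=8, threshold=0.3, seed_node=0):
    g = nx.erdos_renyi_graph(n, avg_degree / n)
    opinion = {u: random.uniform(-1, 1) for u in g}   # -1 = conspiracy, +1 = science (assumed scale)
    infected = {seed_node}
    frontier = [seed_node]
    while frontier:
        nxt = []
        for u in frontier:
            for v in g.neighbors(u):
                # spread only toward like-minded neighbors (assumed similarity rule)
                if v not in infected and abs(opinion[u] - opinion[v]) < threshold:
                    infected.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(infected)   # cascade size

if __name__ == "__main__":
    sizes = [simulate_cascade() for _ in range(20)]
    print("mean cascade size:", sum(sizes) / len(sizes))
```

Sweeping the threshold in a loop shows the qualitative point of the paper: more homogeneous neighborhoods (larger threshold) produce much larger cascades.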
PLOS ONE | 2015
Fabiana Zollo; Petra Kralj Novak; Michela Del Vicario; Alessandro Bessi; Igor Mozetič; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
According to the World Economic Forum, the diffusion of unsubstantiated rumors on online social media is one of the main threats to our society. The disintermediated paradigm of content production and consumption on online social media might foster the formation of homogeneous communities (echo chambers) around specific worldviews. Such a scenario has been shown to be a vivid environment for the diffusion of false claims. Not rarely, viral phenomena trigger naive (and funny) social responses, e.g., the recent case of Jade Helm 15, where a simple military exercise was perceived as the beginning of a civil war in the US. In this work, we address the emotional dynamics of collective debates around distinct kinds of information (science and conspiracy news), both inside and across their respective polarized communities. We find that, for both kinds of content, the longer the discussion, the more negative the sentiment. We show that comments on conspiracy posts tend to be more negative than those on science posts. Moreover, the more engaged users are, the more negative their comments tend to be (on both science and conspiracy posts). Finally, zooming in on the interaction among polarized communities, we find a generally negative pattern: as the number of comments increases, i.e., the discussion becomes longer, the sentiment of the post becomes more and more negative.
PLOS ONE | 2017
Fabiana Zollo; Alessandro Bessi; Michela Del Vicario; Antonio Scala; Guido Caldarelli; Louis M. Shekhtman; Shlomo Havlin; Walter Quattrociocchi
Social media aggregate people around common interests, eliciting collective framing of narratives and worldviews. However, in such a disintermediated environment misinformation is pervasive, and debunking attempts are often undertaken to counter this trend. In this work, we examine the effectiveness of debunking on Facebook through a quantitative analysis of 54 million users over a time span of five years (Jan 2010 to Dec 2014). In particular, we compare how users who usually consume proven (scientific) and unsubstantiated (conspiracy-like) information on US Facebook interact with specific debunking posts. Our findings confirm the existence of echo chambers where users interact primarily with either conspiracy-like or scientific pages. However, both groups interact similarly with the information within their echo chamber. We then measure how users from both echo chambers interacted with 50,220 debunking posts, accounting for both users' consumption patterns and the sentiment expressed in their comments. Sentiment analysis reveals a dominant negativity in the comments on debunking posts. Furthermore, such posts remain mainly confined to the scientific echo chamber. Only a few conspiracy users engage with corrections, and their liking and commenting rates on conspiracy posts increase after the interaction.
PLOS ONE | 2015
Alessandro Bessi; Fabiana Zollo; Michela Del Vicario; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
Social media enabled a direct path from producer to consumer of content, changing the way users get informed, debate, and shape their worldviews. Such disintermediation might weaken consensus on socially relevant issues in favor of rumors, mistrust, or conspiracy thinking, e.g., chemtrails inducing global warming, the link between vaccines and autism, or the New World Order conspiracy. Previous studies pointed out that consumers of conspiracy-like content are likely to aggregate in homophilous clusters, i.e., echo chambers. Along this path we study, by means of a thorough quantitative analysis, how different topics are consumed inside the conspiracy echo chamber on the Italian Facebook. Through a semi-automatic topic extraction strategy, we show that the most consumed content semantically refers to four specific categories: environment, diet, health, and geopolitics. We find similar consumption patterns by comparing users' activity (likes and comments) on posts belonging to these different semantic categories. Finally, we model users' mobility across the distinct topics, finding that the more active a user is, the more likely they are to span all categories. Once inside the conspiracy narrative, users tend to embrace the overall corpus.
social informatics | 2014
Alessandro Bessi; Guido Caldarelli; Michela Del Vicario; Antonio Scala; Walter Quattrociocchi
Despite the enthusiastic rhetoric about so-called collective intelligence, conspiracy theories (e.g., global warming induced by chemtrails or the link between vaccines and autism) find on the Web a natural medium for their dissemination. Users preferentially consume information according to their system of beliefs, and the strife between users of opposite worldviews (e.g., scientific and conspiracist) may result in heated debates. In this work we provide a genuine example of information consumption on a set of 1.2 million Italian Facebook users. We show, by means of a thorough quantitative analysis, that information supporting different worldviews (i.e., scientific and conspiracist news) is consumed in a comparable way. Moreover, we measure the effect of 4,709 evidently false pieces of information (satirical versions of conspiracist stories) and 4,502 debunking memes (information aimed at countering unsubstantiated rumors) on users polarized toward conspiracy claims.
Social Networks | 2017
Michela Del Vicario; Fabiana Zollo; Guido Caldarelli; Antonio Scala; Walter Quattrociocchi
Nowadays users get informed and shape their opinions through social media. However, disintermediated access to content does not guarantee the quality of information. Selective exposure and confirmation bias, indeed, have been shown to play a pivotal role in content consumption and information spreading. Users tend to select information adhering to (and reinforcing) their worldview and to ignore dissenting information. This pattern elicits the formation of polarized groups, i.e., echo chambers, where the interaction with like-minded people might even reinforce polarization. In this work we address news consumption around Brexit in the UK on Facebook. In particular, we perform a massive analysis of more than 1 million users interacting with Brexit-related posts from the main news providers between January and July 2016. We show that consumption patterns elicit the emergence of two distinct communities of news outlets. Furthermore, to better characterize inner group dynamics, we introduce a new technique which combines automatic topic extraction and sentiment analysis. We compare how the same topics are presented in posts with the emotional response in the related comments, finding significant differences between the two echo chambers and showing that polarization influences the perception of topics. Our results provide important insights into the determinants of polarization and the evolution of core narratives in online debates.
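The abstract does not detail the combined topic/sentiment technique, so the following is only a hedged sketch of one generic way to pair topic extraction with lexicon-based comment sentiment. The toy posts, comments, and sentiment lexicon are invented for illustration and are not the authors' data or pipeline.

```python
# Minimal sketch of combining topic extraction with comment sentiment
# (illustrative only; the paper's actual technique and lexicon differ).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = ["brexit vote economy trade", "immigration border vote campaign"]        # toy post texts
comments = [["great decision", "terrible for trade"], ["bad campaign", "good point"]]

vec = CountVectorizer()
X = vec.fit_transform(posts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
topic_of_post = lda.transform(X).argmax(axis=1)          # dominant topic per post

POSITIVE, NEGATIVE = {"great", "good"}, {"terrible", "bad"}  # toy lexicon (assumption)

def sentiment(text):
    # crude lexicon score in [-1, 1]: (positive hits - negative hits) / word count
    words = text.split()
    return (sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)) / max(len(words), 1)

# average comment sentiment per extracted topic
per_topic = {}
for t, cs in zip(topic_of_post, comments):
    per_topic.setdefault(int(t), []).extend(sentiment(c) for c in cs)
for t, scores in per_topic.items():
    print(f"topic {t}: mean comment sentiment {sum(scores) / len(scores):.2f}")
```

Comparing these per-topic sentiment averages across two separate corpora (one per echo chamber) mirrors the kind of contrast the paper draws between how the same topic is received in each community.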
PLOS ONE | 2016
Alessandro Bessi; Fabiana Zollo; Michela Del Vicario; Michelangelo Puliga; Antonio Scala; Guido Caldarelli; Brian Uzzi; Walter Quattrociocchi
Users online tend to select information that supports and adheres to their beliefs, and to form polarized groups sharing the same view, e.g., echo chambers. Algorithms for content promotion may favour this phenomenon by accounting for users' preferences and thus limiting exposure to unsolicited content. To shed light on this question, we perform a comparative study on how the same content (videos) is consumed on different online social media (i.e., Facebook and YouTube) over a sample of 12M users. Our findings show that content drives the emergence of echo chambers on both platforms. Moreover, we show that the users' commenting patterns are accurate predictors of the formation of echo chambers.
Scientific Reports | 2016
Michela Del Vicario; Gianna Vivaldo; Alessandro Bessi; Fabiana Zollo; Antonio Scala; Guido Caldarelli; Walter Quattrociocchi
Recent findings showed that users on Facebook tend to select information that adheres to their system of beliefs and to form polarized groups, i.e., echo chambers. Such a tendency dominates information cascades and might affect public debates on socially relevant issues. In this work we explore the structural evolution of communities of interest by accounting for users' emotions and engagement. Focusing on the Facebook pages reporting on scientific and conspiracy content, we characterize the evolution of the size of the two communities by fitting daily-resolution data with three growth models: the Gompertz model, the Logistic model, and the Log-logistic model. Although all the models appropriately describe the data structure, the Logistic one shows the best fit. We then explore the interplay between the emotional state and engagement of users in the group dynamics. Our findings show that communities' emotional behavior is affected by the users' involvement inside the echo chamber. Indeed, higher involvement corresponds to a more negative approach. Moreover, we observe that, on average, more active users show a faster shift towards negativity than less active ones.
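As a rough illustration of the fitting step described above, here is a sketch of estimating logistic-growth parameters from a daily community-size series with SciPy. The data below are synthetic; the Gompertz and log-logistic fits used in the paper would follow the same pattern with different model functions.

```python
# Sketch: fitting a logistic growth curve to daily community-size data
# (synthetic series here, not the paper's Facebook data).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    # K: carrying capacity, r: growth rate, t0: inflection time
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(0, 365)
# fake daily community sizes with 2% multiplicative noise
y = logistic(t, K=1e6, r=0.05, t0=180) * (1 + 0.02 * np.random.randn(t.size))

params, _ = curve_fit(logistic, t, y, p0=[y.max(), 0.01, t.size / 2])
K_hat, r_hat, t0_hat = params
print(f"K={K_hat:.0f}, r={r_hat:.3f}, t0={t0_hat:.1f}")
```

Comparing goodness of fit (e.g., residual sums of squares or AIC) across the three fitted models is what would single out the logistic curve as the best description, as the abstract reports.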
Scientific Reports | 2017
Michela Del Vicario; Antonio Scala; Guido Caldarelli; H. Eugene Stanley; Walter Quattrociocchi
Online users tend to select claims that adhere to their system of beliefs and to ignore dissenting information. Confirmation bias, indeed, plays a pivotal role in viral phenomena. Furthermore, the wide availability of content on the web fosters the aggregation of like-minded people, where debates tend to reinforce group polarization. Such a configuration might alter the public debate and thus the formation of public opinion. In this paper we provide a mathematical model to study online social debates and the related polarization dynamics. We assume the basic updating rule of the Bounded Confidence Model (BCM) and develop two variations: (a) the Rewire with Bounded Confidence Model (RBCM), in which discordant links are broken until convergence is reached; and (b) the Unbounded Confidence Model, under which the interaction among discordant pairs of users is allowed even with a negative feedback, either with the rewiring step (RUCM) or without it (UCM). From numerical simulations we find that the new models (UCM and RUCM), unlike the BCM, are able to explain the coexistence of two stable final opinions, often observed in reality. Lastly, we present a mean-field approximation of the newly introduced models.
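For reference, here is a minimal sketch of the basic bounded-confidence pairwise update that the paper's variants build on (a Deffuant-style rule). The rewiring and negative-feedback mechanisms of RBCM, UCM, and RUCM are not reproduced here, and all parameter values are illustrative.

```python
# Sketch of the standard bounded-confidence (Deffuant-style) update rule.
import random

def bcm(n=500, eps=0.2, mu=0.5, steps=200_000, seed=0):
    random.seed(seed)
    x = [random.random() for _ in range(n)]          # opinions in [0, 1]
    for _ in range(steps):
        i, j = random.randrange(n), random.randrange(n)
        # interact only if opinions are within the confidence bound eps
        if i != j and abs(x[i] - x[j]) < eps:
            shift = mu * (x[j] - x[i])
            x[i] += shift                            # both agents move toward each other
            x[j] -= shift
    return x

if __name__ == "__main__":
    opinions = bcm()
    print("opinion range after convergence:", min(opinions), max(opinions))
```

Under this basic rule, opinions collapse into one or more clusters depending on eps; allowing interactions beyond the bound (with repulsive feedback) is what the UCM/RUCM variants add to reproduce two coexisting stable opinions.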
arXiv: Social and Information Networks | 2015
Alessandro Bessi; Fabiana Zollo; Michela Del Vicario; Antonio Scala; Fabio Petroni; Bruno Gonçalves; Walter Quattrociocchi
Facebook is flooded with diverse and heterogeneous content, from kittens to music and news, by way of satirical and funny stories. Each piece of that corpus reflects the heterogeneity of the underlying social background. On the Italian Facebook we have found an interesting case: a page having more than