Don't miss this interesting and timely seminar, given by Walter Quattrociocchi in Aula R2 on Friday, February 19. For anyone who wants to learn more about the hoaxes ("bufale") circulating on social networks, this is the right seminar! The abstract of the talk follows.

How does (mis)information spread online
The growth of knowledge, fostered by the Internet and the unprecedented acceleration of scientific and technological progress, has exposed society to increasingly complex information for explaining reality and its phenomena. Meanwhile, a paradigm shift has occurred in the production and consumption of content: away from largely centralized production and toward a setting in which users themselves have increasing control over the flow of information. On the Web, people can produce and access a variety of information, actively participating in the creation and diffusion of narratives. But what about the quality of the information on which such narratives are grounded? Is more information, by definition, always better? In 2013 the World Economic Forum listed massive digital misinformation as one of the main risks for the functioning of modern society. This phenomenon offers fertile ground for interdisciplinary research, as it involves the cognitive dimension of users faced with the massive amount of information online and can be investigated with data-science tools.

Through a thorough, large-scale quantitative analysis, we provide insights into the pivotal role of confirmation bias in the emergence of echo chambers. Focusing on the Italian Facebook, we show that unsubstantiated online rumors (e.g., the link between vaccines and autism, global warming induced by chemtrails, the secret alien government) reverberate in a way comparable to mainstream information such as scientific news and updates. In another work, we showed that massive digital misinformation permeates online social dynamics, creating viral phenomena even around intentionally parodistic false information, and that communities on online social media aggregate around shared narratives. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., "echo chambers." Indeed, homogeneity appears to be the primary driver of content diffusion, and each echo chamber has its own cascade dynamics. Whether a news item, substantiated or not, is accepted as true by a user may be strongly affected by social norms or by how much it coheres with the user's system of beliefs. We analyzed the emotional dynamics inside and across echo chambers, finding that sentiment tends to become negative as discussions grow longer.

We performed further analysis on the US Facebook, examining the effectiveness of debunking through a quantitative analysis of millions of users over a five-year time span (Jan 2010 to Dec 2014). Our results confirm the existence of echo chambers where users interact primarily with either conspiracy-like or scientific pages. However, both groups interact similarly with the information within their echo chamber. We examined users' responses to 50,000 debunking posts, finding that attempts at debunking are largely ineffective. Debunking efforts serve to reinforce the beliefs of people already in the science echo chamber rather than convince people in the conspiracy echo chamber to change their views. Indeed, only a small fraction of the usual consumers of unsubstantiated information interact with the posts.
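To make the echo-chamber notion concrete, here is a minimal Python sketch of one way to score a user's polarization from like counts on science versus conspiracy pages. It is not the study's actual pipeline: the data layout, the threshold value, and all names are illustrative assumptions.

```python
# Minimal sketch (assumed data layout, not the authors' code): score each user's
# polarization toward "science" vs. "conspiracy" pages from like counts, then
# label strongly polarized users as members of the corresponding echo chamber.
from collections import defaultdict

# Toy records: (user_id, page_category, number_of_likes) -- hypothetical data.
likes = [
    ("u1", "science", 40), ("u1", "conspiracy", 2),
    ("u2", "science", 1),  ("u2", "conspiracy", 35),
    ("u3", "science", 10), ("u3", "conspiracy", 9),
]

totals = defaultdict(lambda: {"science": 0, "conspiracy": 0})
for user, category, n in likes:
    totals[user][category] += n

def polarization(counts):
    """Score in [-1, 1]: -1 = only science likes, +1 = only conspiracy likes."""
    s, c = counts["science"], counts["conspiracy"]
    return (c - s) / (c + s) if (c + s) else 0.0

THRESHOLD = 0.95  # illustrative cutoff for "strongly polarized"
for user, counts in totals.items():
    rho = polarization(counts)
    if rho > THRESHOLD:
        label = "conspiracy echo chamber"
    elif rho < -THRESHOLD:
        label = "science echo chamber"
    else:
        label = "not strongly polarized"
    print(f"{user}: rho={rho:+.2f} -> {label}")
```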
Furthermore, we show that those few are often the most committed conspiracy users and, rather than internalizing debunking information, often react to it negatively. After interacting with debunking posts, users retain, or even increase, their engagement within the conspiracy echo chamber. Last but not least, by comparing consumption of the exact same content on Facebook and YouTube, we show that polarization and the formation of echo chambers are independent of the platform. Our findings served to inform the World Economic Forum's Global Risks Report 2016.
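As a toy illustration of the kind of before/after comparison described above, the following sketch (again an assumption about data layout, not the authors' code) counts a user's activity on conspiracy pages before and after their first contact with a debunking post.

```python
# Minimal sketch (hypothetical data): compare a user's activity inside conspiracy
# pages before and after their first interaction with a debunking post, to check
# whether engagement drops, persists, or grows.
from datetime import date

# Toy records: per-user dates of likes/comments on conspiracy pages, plus the
# date of the first interaction with a debunking post (if any).
conspiracy_activity = {
    "u2": [date(2013, 3, 1), date(2013, 6, 10), date(2014, 1, 5), date(2014, 2, 20)],
}
first_debunking_contact = {"u2": date(2013, 12, 1)}

for user, events in conspiracy_activity.items():
    cutoff = first_debunking_contact.get(user)
    if cutoff is None:
        continue  # user never interacted with a debunking post
    before = sum(1 for d in events if d < cutoff)
    after = sum(1 for d in events if d >= cutoff)
    print(f"{user}: {before} actions before debunking contact, {after} after")
```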
