Invitation – Research Seminar "Biased and inaccurate information on social networks"

Research Cluster

Speaker: Dr Mohsen Mosleh
Date: Monday 22 February 2021
Time: 16:00 - 17:00
Location: Zoom

Further details

Bio: Mohsen Mosleh is a Lecturer (Assistant Professor) at the University of Exeter Business School and a Research Affiliate at MIT. He was a postdoctoral fellow in the Human Cooperation Lab at the MIT Sloan School of Management and the Department of Psychology at Yale University. Prior to his postdoctoral studies, he received his PhD in Systems Engineering, with a minor in data science, from Stevens Institute of Technology. He also has five years of industry experience as a Software & Systems Integration Lead. Mohsen’s research interests lie at the intersection of computational/data science and cognitive/social science. In particular, he studies how information and misinformation spread on social media, collective decision-making, and cooperation.

Website:

Title: Biased and inaccurate information on social networks

Abstract: There has been a great deal of concern about the negative impacts of social media on democracy and society. In this talk, I provide an overview of my research on two ways in which social networks can negatively affect public discourse. First, I will discuss studies examining the spread of misinformation on Twitter. I begin by describing a hybrid lab-field study in which roughly 2,000 Twitter users completed a cognitive survey. We find that people who rely on intuitive gut responses over analytical thinking share lower-quality content. I then build on this observation with a Twitter field experiment that uses a subtle intervention to nudge people to think about accuracy. We messaged over 5,000 Twitter users who had previously shared links to misinformation sites and asked them to rate the accuracy of a single non-political headline, thereby making the concept of accuracy more top-of-mind for them. The message significantly improved the quality of the news sources they subsequently shared. I will also describe an ongoing project on the effect of fact-check delivery on social media.
Our experimental design translates directly into an intervention that social media companies could deploy to fight misinformation online. Next, I will discuss a set of large social network experiments in which n = 2,520 human subjects played a voter game on artificial social networks I constructed. These experiments demonstrate how, even in the absence of misinformation, decisions can be heavily biased by the structure of the network via a phenomenon we call “information gerrymandering”. I will also describe ongoing empirical work on preferential social tie formation. Together, these projects provide an integrative view of human cognition and decision-making on (social) networks.