Daron Acemoğlu, Asuman Ozdaglar, James Siderius, 30 June 2021

Misinformation spreads rapidly on social media platforms. This column uses a model of online content-sharing to show that a social media platform that wishes to maximise content engagement will propagate extreme articles amongst its most extremist users. ‘Filter bubbles’ prevent the content from spreading beyond its extremist demographic, creating ‘echo chambers’ in which misinformation circulates. The threat of censorship and a corresponding loss in engagement could pressure platforms to fact-check themselves, while regulating their algorithms could mitigate the consequences of filter bubbles. 

Nicolas Ajzenman, Tiago Cavalcanti, Daniel Da Mata, 02 May 2020

Regardless of their scientific soundness, COVID-19 recommendations from political leaders such as President Trump are taken seriously by followers. In Brazil, President Bolsonaro has publicly flouted social distancing measures and downplayed the seriousness of the disease in at least two well-publicised instances. This column analyses the effects of Bolsonaro’s actions and speeches in the month of March on Brazilians’ social-distancing behaviours, using electoral data and geo-localised mobile phone data from 60 million devices. The findings suggest that social distancing declined in municipalities with stronger support for Bolsonaro.

Anke Kessler, Tom Cornwall, 06 June 2012

Does misinformation demobilise the electorate? This column measures the impact of alleged ‘robocalls’ on voter turnout in the 2011 Canadian federal election.
