Daron Acemoğlu, Asuman Ozdaglar, James Siderius, 30 June 2021

Misinformation spreads rapidly on social media platforms. This column uses a model of online content-sharing to show that a social media platform that wishes to maximise content engagement will propagate extreme articles amongst its most extreme users. ‘Filter bubbles’ prevent the content from spreading beyond its extremist demographic, creating ‘echo chambers’ in which misinformation circulates. The threat of censorship and a corresponding loss in engagement could pressure platforms to fact-check themselves, while regulating their algorithms could mitigate the consequences of filter bubbles.
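
To make the filter-bubble mechanism concrete, here is a minimal toy sketch (our illustration, not the authors' formal model; the ideology scale, article slants, and engagement function are all assumptions): an engagement-maximising platform serves each user the article closest to their ideology, so extreme, often false, articles circulate only among the most extreme users.

```python
import random

# Hypothetical toy sketch, not the authors' model: users have an ideology
# in [-1, 1]; articles have a slant in [-1, 1] and a veracity flag. A
# myopically engagement-maximising platform shows each user the article
# whose slant is closest to the user's ideology, so extreme (here, false)
# articles end up circulating only among the most extreme users.

random.seed(0)

users = [random.uniform(-1, 1) for _ in range(1000)]
articles = [
    {"slant": -0.90, "true": False},
    {"slant": -0.20, "true": True},
    {"slant": 0.10, "true": True},
    {"slant": 0.95, "true": False},
]

def engagement(user, article):
    # Assumed engagement function: falls linearly with the ideological
    # distance between user and article, floored at zero.
    return max(0.0, 1.0 - abs(user - article["slant"]))

# Platform's myopic rule: serve whichever article maximises engagement.
served = [max(articles, key=lambda a: engagement(u, a)) for u in users]

for art in articles:
    audience = [u for u, a in zip(users, served) if a is art]
    if audience:
        mean = sum(audience) / len(audience)
        print(f"slant {art['slant']:+.2f} (true={art['true']}): "
              f"{len(audience)} users, mean ideology {mean:+.2f}")
```

Running the sketch, the false articles at slants -0.90 and +0.95 are served only to audiences whose mean ideology is itself extreme, while moderate users see only the truthful, centrist articles: an echo chamber in miniature.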

Sagit Bar-Gill, Neil Gandal, 10 April 2017

Online echo chambers – in which people engage only with others who share, and media that reflect, their opinions and biases – have become an area of concern in the wake of last year’s startling political upsets. This column investigates how users navigate and explore an online content space. Highly social users and younger users are most likely to get caught in echo chambers, while opinion leaders are less likely to get caught. Reducing the visibility of content popularity information, such as ‘like’ and ‘view’ counts, may help mitigate echo chamber effects.
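
A minimal sketch of why hiding popularity cues could help, assuming a simple popularity-proportional choice rule (our assumption, not the paper's estimation): when counts are visible, attention follows a rich-get-richer dynamic and concentrates on a few items; when counts are hidden, exploration is spread more evenly across the content space.

```python
import random

# Hypothetical illustration, not the paper's model: with visible
# 'like'/'view' counts, users pick content in proportion to displayed
# popularity (preferential attachment); with counts hidden, they explore
# the content space uniformly at random.

def simulate(show_counts, n_items=50, n_clicks=5000, seed=1):
    rng = random.Random(seed)
    views = [1] * n_items  # seed each item with one view
    for _ in range(n_clicks):
        if show_counts:
            # Counts visible: choose proportionally to current views.
            pick = rng.choices(range(n_items), weights=views)[0]
        else:
            # Counts hidden: explore uniformly.
            pick = rng.randrange(n_items)
        views[pick] += 1
    return max(views) / sum(views)  # share of views on the top item

print(f"counts visible: top item gets {simulate(True):.1%} of views")
print(f"counts hidden:  top item gets {simulate(False):.1%} of views")
```

Under the visible-counts rule the single most popular item captures a far larger share of views than under uniform exploration, which is one mechanical reading of the column's suggestion that dimming popularity signals could soften echo chamber effects.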
