Content Moderation


In 2018, I spent a year as part of a project with researchers at Harvard, Stanford, and Oxford looking at how content policy is made at Facebook. While much important work in recent years has examined the practice of ‘moderation’ (the removal of content by a global network of contractors), less work at the time had looked at the employees and processes that create the rules those moderators are asked to enforce. Working with Timothy Garton Ash, I examined and critiqued Facebook’s updated ‘Community Standards’ before they were published in the spring of 2018, spent time ‘under the hood’ trying to better understand new processes such as Facebook’s ‘Content Standards Forum,’ and co-authored an Oxford-Stanford report based on our interviews.


. Democratic Transparency in the Platform Society. In Social Media and Democracy: The State of the Field, edited by Nathaniel Persily and Joshua A. Tucker. New York, NY: Cambridge University Press, 2019.

Project · SocArXiv

. Glasnost! Nine ways Facebook can make itself a better forum for free speech and democracy. Oxford, UK: Reuters Institute for the Study of Journalism, 2019.

Project · Report