This conference explores how online platforms create policy to moderate and remove user content posted on their sites, how they operationalize those policies, and how these policies affect the culture of online speech for individuals and new media.
Panels will include policy representatives from top platforms, former moderators speaking to the history of how content moderation policies were created, simulation exercises for the audience, and a keynote panel on content moderation and free speech with Ben Smith (Editor-in-Chief of Buzzfeed) and Josh Marshall (Founder, Editor, and Publisher of Talking Points Memo) moderated by Jack Balkin (Professor of Law at Yale Law School).
Coffee, light breakfast, box lunch, and reception after the event will be provided.
The conference is grateful to the following organizations for their generous sponsorship of this event:
A tentative working schedule for the conference is below; please check back for updates.
8:30 am Breakfast and Registration
9:20 am Opening Remarks
9:30-10:45 am History of Content Moderation
Today, controversies seem to erupt daily over what kind of user content platforms allow or don’t allow on their sites, but far less time is spent exploring why and how these moderation systems were initially put in place. This panel is a conversation with people who worked on the ground to create and implement speech policies at three of the world's largest online content platforms: Facebook, YouTube, and Twitter.
Kate Klonick, Assistant Professor of Law at St. John’s University Law School
Alexander Macgillivray, former General Counsel of Twitter
Micah Schaffer, former Policy Analyst and Community Manager at YouTube
Moderated by Steve Freeman of the Anti-Defamation League
11:00am-12:00 pm Facebook, Google, Twitter: Current Content Policy
With the historical backdrop of the previous panel, this panel explores the current policies around content moderation at Twitter, Facebook, & Google.
Peter Stern, Product Policy Stakeholder Engagement at Facebook
Nora Puckett, Senior Litigation Counsel at Google
Jerrel Peterson, Trust and Safety at Twitter
Moderated by Eric Goldman, Professor of Law at Santa Clara University
12:00-1:15 pm Lunch
1:15-2:30 pm Keynote Panel: Content Moderation, the Press, & the First Amendment
From the removal of Alex Jones across platforms to President Trump’s calls to end what he characterizes as anti-conservative bias in moderation policies, political news and political reporting seem bound up in platform content moderation. This panel explores the past and future impact of platform content moderation policies on the press, speech, and democratic society.
Ben Smith, Editor-in-Chief of Buzzfeed
Josh Marshall, Founder, Publisher, and Editor of Talking Points Memo
Moderated by Jack Balkin, Professor of Law at Yale Law School
2:45-3:45 pm The Effect of Being Banned: Content Moderation and Reporting
You don’t have to be Alex Jones to get blocked from a platform; many reporters have had their content taken down or removed because it was found to violate platform policies. This panel contemplates the effects of this kind of takedown not only on the press, but on the people and events in the stories they publish.
Emily Bell, Founding Director of the Tow Center for Digital Journalism at Columbia’s Graduate School of Journalism
Casey Newton, Senior Editor at The Verge
Surya Mattu, Data Reporter at The Markup
Moderated by Nabiha Syed, Deputy General Counsel at Buzzfeed
4:00-5:00 pm Simulation: You Be the Moderator
The discussion around setting content moderation policy can be very abstract; this final simulation, with audience participation, attempts to re-create some of the practical struggles faced by platform moderators and policymakers in trying to create global speech standards that operate transnationally.
Moderated by the Center for Democracy and Technology
5:00 pm Reception