Today we're releasing a draft charter giving more detail about a content oversight board.
We welcome debate about how to help keep Facebook a safe place where people can express their ideas. But that debate should be based on facts, not mischaracterizations.
EFF and more than 100 civil society organizations across the globe wrote directly to Mark Zuckerberg recently demanding greater transparency and accountability for Facebook content moderation practices. A key step, we told Facebook, is implementation of a robust appeals process giving all users the...
70 of the world's leading human rights groups ask Mark Zuckerberg to create due process for censored content
There’s a lot of talk these days about “content moderation.” Policymakers, some public interest groups, and even some users are clamoring for intermediaries to do “more,” to make the Internet more “civil,” though there are wildly divergent views on what that “more” should be. Others vigorously...
Contractors ‘irreparably traumatized’ by having to witness child abuse, rape, torture, suicide and murder, says former employee
Content review at this size has never been done before. We recognize the enormity of this challenge and the responsibility we have to get it right.
This week a TV report on Channel 4 in the UK has raised important questions about our policies and processes.
C4 Dispatches documentary finds moderators left Britain First’s pages alone as ‘they generate a lot of revenue’
Washington, D.C.—The Electronic Frontier Foundation (EFF) called on Facebook, Google, and other social media companies today to publicly report how many user posts they take down, provide users with detailed explanations about takedowns, and implement appeals policies to boost accountability....
For some time now, it’s been clear that one essential response to the flood of misinformation and other deceptive Internet tactics must be…
Logan Paul’s controversial dead body video is a watershed moment in YouTube’s effort to grapple with a vast content moderation problem.
Moderating user-generated content is hard. It becomes easier, though, with a realistic understanding that the Internet reflects humanity: it is capable of both good and evil.
A trove of internal documents sheds light on the algorithms that Facebook’s censors use to differentiate between hate speech and legitimate political expression.
Reviewing online material on a global scale is challenging and essential.
10 guidelines that we expect all participants in the Guardian's community areas to abide by
The social network has pledged to work harder to identify and remove disturbing content – but doing so can take a psychological toll