The question of what should be moderated, and when, is an increasingly frequent one in tech. There is no bright line, but there are ways to get closer to an answer.
Facebook’s independent oversight board will have a final charter by August. It’s messy, but so is every constitutional convention
The Data Transparency Advisory Group just released an independent assessment of whether the metrics we share in our enforcement report are accurate and meaningful.
Around the world, governments are making ad hoc rules to curb the spread of hate and disinformation. That legislation will not succeed as long as companies like Facebook and YouTube keep their shutters closed and can continue to shirk their responsibility.
The C.E.O. of Twitter no longer seems capable of controlling the system he’s created.
Site says live streams have recently been prioritised for review only when flagged for suicide
YouTube is disabling comments on videos featuring minors, because sexually suggestive remarks frequently appeared beneath the clips. The giant video platform has struggled for years with extreme behaviour by creators and users, but moderating at this scale is nearly impossible.
Today we're releasing a draft charter giving more detail about a content oversight board.
We welcome debate about how to help keep Facebook a safe place where people can express their ideas. But that debate should be based on facts, not mischaracterizations.
EFF and more than 100 civil society organizations across the globe wrote directly to Mark Zuckerberg recently demanding greater transparency and accountability for Facebook content moderation practices. A key step, we told Facebook, is implementation of a robust appeals process giving all users the...
70 of the world's leading human rights groups ask Mark Zuckerberg to create due process for censored content
There’s a lot of talk these days about “content moderation.” Policymakers, some public interest groups, and even some users are clamoring for intermediaries to do “more,” to make the Internet more “civil,” though there are wildly divergent views on what that “more” should be. Others vigorously...
Contractors ‘irreparably traumatized’ by having to witness child abuse, rape, torture, suicide and murder, says former employee
Content review at this size has never been done before. We recognize the enormity of this challenge and the responsibility we have to get it right.
This week a TV report on Channel 4 in the UK has raised important questions about our policies and processes.
C4 Dispatches documentary finds moderators left Britain First’s pages alone as ‘they generate a lot of revenue’
Washington, D.C.—The Electronic Frontier Foundation (EFF) called on Facebook, Google, and other social media companies today to publicly report how many user posts they take down, provide users with detailed explanations about takedowns, and implement appeals policies to boost accountability....
For some time now, it’s been clear that one essential response to the flood of misinformation and other deceptive Internet tactics must be…
Logan Paul’s controversial dead body video is a watershed moment in YouTube’s effort to grapple with a vast content moderation problem.
Moderating user-generated content is hard: it is easier, though, with a realistic understanding that the Internet reflects humanity — it is capable of both good and evil.
A trove of internal documents sheds light on the algorithms that Facebook’s censors use to differentiate between hate speech and legitimate political expression.
Reviewing online material on a global scale is challenging and essential.
10 guidelines which we expect all participants in the Guardian's community areas to abide by
The social network has pledged to work harder to identify and remove disturbing content – but doing so can take a psychological toll