Leaked documents show how Facebook censors users, and reveal policies on sex, terrorism and hate

Image credit: AlesiaKan / Shutterstock

At the moment, Facebook appears to be more concerned with keeping fake news out of users' news feeds and clamping down on clickbait and propaganda, but there has also been a lot of interest recently in how the social network moderates -- or censors -- content posted by its users. A leak of what has been dubbed The Facebook Files gives a fascinating insight into how the company moderates content, shedding light on its secret internal guidelines.

Some of the rules are surprising: a livestream of self-harm, for instance, will not be censored, and still images of animal abuse are fine. Others are less so: the promotion of terrorism and terrorist groups is not permitted, and the same goes for direct threats to someone's life (although the wording of the threat is important). With nearly 2 billion users, Facebook employs an incredibly small team of moderators, and the leaked documents show they have a very tough time.

Published by the Guardian, The Facebook Files are a series of manuals given to moderators after two weeks' training with the social network. The moderators supplement Facebook's automated, algorithm-driven systems, which try to identify content that is not permitted before it is even published. The human team exists to handle content reports sent in by users -- everything from child abuse, cannibalism and racism, to sexual content, threats and terrorism.

Despite Facebook's continued protestations that it is not a publisher, its moderation documents bear a marked similarity to those that would be used by a publisher. They are also likely to cause confusion and even offense as users see for the first time just how -- and why -- some of their content is being filtered and censored. There is also likely to be heated and lengthy debate about the logic and reasoning behind some of Facebook's rules.

The site acknowledges that in the course of heated discussions, and when talking about emotive topics, people will use strong language to express their feelings. As such, an outburst of "fuck off and die" would be permitted. A direct threat such as "Trump should be shot", however, would be deleted, because heads of state are a protected category.

It's OK to share still photographs of animal abuse as they can -- Facebook says -- be used to raise awareness of an issue. A video of the same abuse, however, is not allowed. There have been numerous incidents recently of Facebook Live being used to stream all manner of potentially upsetting content. Facebook says, though, that should someone choose to livestream themselves self-harming, moderators will not stop the stream. The idea is that the company "doesn't want to censor or punish people in distress", and the livestream itself is seen as an opportunity for help to be offered. The same even applies to suicide attempts.

Other items on the "banned" list include Holocaust denial, threats of violence against public figures, sexual violence, and the promotion of terrorism and extremism. The "permitted" list includes things such as nudity in art, criticism of terrorism and extremism, bullying and abuse of children if there is no sexual or sadistic element, and videos of violent deaths that could be used to raise awareness of issues.

Take a look at the Guardian website for more detail, including Facebook's guidance on graphic violence.


