How Facebook decides what violent and explicit content is allowed


Facebook will allow livestreams of self-harm for a time, as the company "doesn't want to censor or punish people in distress who are attempting suicide", but the footage will be deleted once there is "no longer an opportunity to help the person".

A variety of reports about Facebook's rules and policies over content moderation have just been published, offering an in-depth look at how the company deals with content shared on its site and how it determines what users can post.

The leaked rules land not only at the height of election season in the United Kingdom, but also follow politicians of all stripes attacking Google, Twitter, and Facebook for failing to effectively police the content posted on their ad-stuffed services.

As Mashable put it, "Facebook is making it up as they go along and we're the collateral damage", a sentiment that more and more people share and one that reflects badly on the social network.

Similarly, videos uploaded or recorded live showing violent deaths will likely remain in place, as Facebook believes they can help raise awareness of issues such as mental illness.

The popular social media network aims to allow "as much speech as possible" but says it does draw the line at "content that could credibly cause real world harm".

Aside from footage of actual violence, Facebook must also decide how to respond to threats of it - what it calls "credible threats of violence". It specifies that generic or "aspirational" violence, like threatening to "beat up fat kids", is permitted, whereas writing, say, "someone shoot Trump" is not.

Not all videos of violent deaths have to be deleted.


Threats against so-called "protected categories", such as President Trump, should be deleted according to the leaked files - which is why "someone shoot Trump" is not acceptable to post.

Facebook will allow some "newsworthy exceptions" to photographic nudity as well as "hand-made" nude art, but other photos and digital art are off-limits. Many moderators have said they find the policies inconsistent and confusing - for example, not all rape threats are treated equally. The site uses some automated systems to proactively eliminate content, such as child sexual abuse or terrorism material, but what remains falls to teams of moderators.

It's a bit depressing how hard it is to tease out the rationale behind these guidelines: "I'm going to kill you" is not considered a credible threat because it's abstract, yet the very specific "unless you stop bitching I'll have to cut your tongue out" is still allowed.

"In addition to investing in more people, we're also building better tools to keep our community safe," the company said.

Footage of animal abuse is allowed, but may need to be classified as "disturbing".

Facebook's policies on sexual content are deemed the most complex and confusing, The Guardian reports.

Images of child abuse are removed if they are shared with "sadism and celebration".

Some extremely disturbing imagery may simply be "marked as disturbing" rather than removed.