Facebook Has Revealed the Hyper-Specific Internal Rules It Uses to Police Content


The company is publishing updated “Community Standards” that spell out for users exactly what they are allowed to post, and what is forbidden.

Deciding what content is allowed on Facebook and what is not is one of the company’s biggest headaches. The platform is constantly criticized either for censoring content or for leaving up harmful material. After every backlash it has to clarify, for example, when it bans nudity (generally, female nipples) and when it makes exceptions (breastfeeding, artwork).

Now, the company is publishing updated “Community Standards” that spell out for users exactly what they are allowed to post, and what is forbidden. To date, these standards have been vague and not particularly comprehensive. The new document runs to 27 printed pages and is at times incredibly granular. Occasionally, the level of detail generates further questions, or sounds downright bizarre.

The standards “closely mirror” internal guidelines that Facebook’s content moderators, 7,500 people in offices all around the world, use to screen posts. The embattled company is making them available to the public in an effort to be more transparent about its policies. Monika Bickert, VP of Global Policy Management, says in a post that there are two reasons for the move:

First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.

The experts she mentions include herself (a former federal prosecutor), other members of the Facebook team, among them a former rape counselor and a human rights lawyer, and outside voices such as Timothy Garton Ash, an Oxford University historian. In the announcement, Bickert is careful to admit that Facebook’s policies and moderation practices are not perfect, that the company makes mistakes, and that the process is evolving. “We make mistakes because our processes involve people, and people are fallible,” she says.

Content reviewers are audited on a weekly basis, to limit mistakes and bias.

In addition to revealing the guidelines content moderators follow, the company said it is introducing an appeals process for individual posts. If a post is determined to have violated Facebook’s rules, it will be taken down, but the user will get an option to appeal. Until now, users could only appeal if their accounts were disabled, or if violations had to do with copyright infringement. The appeals process won’t be rolled out for all content immediately: it will start for posts flagged for nudity or sex, hate speech, and graphic violence.

Most of the guidelines are understandable, and it’s possible to discern where the details come from. For example, Facebook tells us that it considers a serial murderer (who is not allowed on the platform) to be “any individual who has committed 2 or more murders over multiple incidents or locations.” A mass murder, meanwhile, is considered to be four deaths in one incident, which roughly tracks the FBI’s distinction between the two types of crime.

Other rules were clearly developed in response to backlash Facebook has received in the past. The platform bans the targeting of victims or survivors of violent tragedies with claims that a person is “lying about being a victim of an event” or “acting/pretending to be a victim of an event,” referring to a common trope used by trolls, most famously after the Parkland shooting.

Facebook has also been repeatedly criticized over its policies on nudity, particularly over censoring female breasts. The new guidelines say:

Our nudity policies have become more nuanced over time. We understand that nudity can be shared for a variety of reasons, including as a form of protest, to raise awareness about a cause, or for educational or medical reasons. Where such intent is clear, we make allowances for the content. For example, while we restrict some images of female breasts that include the nipple, we allow other images, including those depicting acts of protest, women actively engaged in breast-feeding, and photos of post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures.

In other cases, the guidelines are somewhat puzzling. “Uncovered female nipples for children older than toddler-age” is not allowed in posts on the platform. But where is the cut-off for a toddler?

The company says it provides “some protections for immigration status” but also that it allows “criticism of immigration policies and arguments for restricting those policies,” which could be a difficult line to draw.

In the “suicide and self-injury” section, the company says it provides resources to people who post content that mentions suicide or self-mutilation, as well as “images where more than one cut of self mutilation is present on a body part and the primary subject of the image is one or more unhealed cuts.”

Perhaps the strangest prohibition, in the section about bullying, illustrates the challenge the company faces in trying to police so many different kinds of content. It won’t allow “comparison to animals that are culturally perceived as intellectually or physically inferior or to an inanimate object,” a rule that raises more questions than it answers and will be particularly difficult to enforce.