Facebook has published a detailed update of its “community standards,” which outline what kinds of content may be shared or blocked on the social networking site, explicitly defining what it considers offensive and harmful images and language. In a blog post on Sunday, company executives wrote: “While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples… For example, what exactly do we mean by nudity, or what do we mean by hate speech?”
Here’s exactly what Facebook means when it says, “We restrict the display of nudity”:
We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring.
Acknowledging that “people sometimes share content containing nudity for reasons like awareness campaigns or artistic projects,” Facebook does allow “photographs of paintings, sculptures, and other art that depicts nude figures.” Images of sexual intercourse and vivid descriptions of sexual acts are not allowed.
The clarified guidance comes in the wake of new content-sharing rules at Twitter and Reddit, where users sometimes post sensitive photos and strongly worded language just as they do on Facebook, which had 1.4 billion monthly active users at last count. Twitter now forbids posting private information (including photographs) about another person without his or her consent; Reddit has banned posting sexually explicit photos and videos without the consent of the people depicted.
Facebook’s policies encompass much more than personal privacy issues and consensual sharing of nude photos or sex tapes.
For example, any content posted “with the intention of degrading or shaming” individuals may be removed—including most forms of hate speech. Exceptions may be made for “content containing someone else’s hate speech for the purpose of raising awareness or educating others about that hate speech.”
Facebook places significant emphasis on context when deciding whether to remove a post or to restrict it by geography: when a shared image violates a specific country’s laws, for example, the image may not be removed from the site entirely but will be blocked for users in that country. Each decision about removing or blocking content, or suspending a user’s account, is made by a real person, which sacrifices speed for the sake of (hopefully) taking contextual cues into account. Facebook continues to insist that user accounts be associated with authentic names and identities, though it has loosened this controversial policy slightly, from requiring “legal” names to “authentic” ones.
In addition to updating its community standards, Facebook released a report detailing requests it received from governments around the world during the second half of 2014. Facebook documented an 11% rise, overall, in governmental requests to censor content, and a much smaller rise in governmental requests for account data.