After a year in which Facebook faced ample scrutiny over its handling of various controversies (culminating in founder Mark Zuckerberg testifying before Congress), the company has publicly revealed the standards it uses to moderate content. Although Facebook has always had some publicly visible standards, the new document shows what factors the company weighs when deciding whether to remove content from the site. You can read the full standards here, but if scrolling through several pages of text seems like a bit much, we’ve got the highlights.
Facebook claims its standards are built around three pillars: Safety (removing content that harms others), Voice (letting users express diverse views and ideas), and Equity (applying the same standards to all users). Regarding that last point, Facebook allows for some relaxation of its standards depending on context.
One broad category of content that Facebook might remove is anything that encourages violence or criminal behavior. “Credible violence” covers content or accounts that Facebook believes pose a realistic threat of harm. The social network also forbids users associated with terrorist or hate groups, human trafficking, or other forms of organized crime. Finally, Facebook cracks down on users buying, selling, or trading “regulated goods,” such as drugs or firearms, on the platform.
Under a category it terms “Safety,” Facebook’s standards state that the company will remove posts that encourage self-harm or suicide, as well as posts that involve sexual exploitation (of children or adults). According to the standards, Facebook defaults to “removing sexual imagery to prevent the sharing of non-consensual or underage content,” though the company also recognizes that nudity sometimes serves the public interest, as in protests or artwork.
Relatedly, Facebook might remove content deemed “objectionable.” This category includes the aforementioned sexual content, along with graphic violence and hate speech, the latter of which Facebook defines as “a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease.”
Facebook’s community standards also include a section on “integrity and authenticity.” Spam is verboten, as is fake news, a sensitive subject after the public drubbing Facebook took when purveyors of fake news exploited the network during the 2016 presidential election. The standards also emphasize that users shouldn’t misrepresent themselves on Facebook: you’re supposed to use your real name, and if you should happen to die, your relatives can have your account converted into a memorial so that nobody appears to be posting from beyond the grave.
Finally, Facebook’s rules state that while users “own” and control the content they post to the site, they also need to respect intellectual property law. In short: make sure you don’t infringe on anyone’s trademarks or copyrights!