- Facebook released its Community Standards to the public on Tuesday.
- These are the rules users need to abide by if they don't want to be banned.
- Facebook also discusses requests it will honor from users who want to delete their own accounts or remove specific accounts they don't own (such as those of deceased immediate family members).
Facebook released its "Community Standards" on Tuesday, a list of official rules that outlines the types of posts that can get you banned from using Facebook. It also outlines the types of users it doesn't allow to post.
I dug through Facebook's guidelines to understand what's allowed and what isn't.
Facebook breaks down the types of unacceptable posts and content into six categories: "Violence and Criminal Behavior," "Safety," "Objectionable Content," "Integrity and Authenticity," "Respecting Intellectual Property," and "Content-Related Requests."
This is what is and what isn't allowed:
Facebook bans all threats and calls to violence, and says its team works to determine the difference between "casual statements" and "content that constitutes a credible threat to public or personal safety."
Facebook blocks (and is working to continue to block):
- Terrorist activity
- Organized hate
- Mass or serial murder
- Human trafficking
- Organized violence or criminal activity
- Regulated goods
Facebook CEO Mark Zuckerberg was specifically called out on Capitol Hill for allowing ads for regulated items, such as opioids, to run on Facebook, but the company's terms prohibit individuals from selling or trading drugs (including non-medical drugs), firearms, ammunition, and more. It says it allows discussion of some of these topics, such as firearms.
Other banned topics include anything (or anyone) that promotes or publicizes crime or tries to coordinate harm.
In the safety section of its Community Standards, Facebook says it will "remove content, disable accounts, and work with law enforcement when we believe there is a genuine risk of physical harm or direct threats to public safety." This includes posts about suicide and self-injury.
Facebook also bans:
- Child nudity and sexual exploitation of children (such as nude images, even if they're posted with "good intentions").
- Images of sexual violence.
- Bullying that "purposefully targets private individuals with the intention of degrading or shaming them."
- Private information that could cause someone physical or financial harm.
Of note: Facebook says its "bullying policies do not apply to public figures because we want to allow discourse, which often includes critical discussion of people who are featured in the news or who have a large public audience." Facebook will remove content about public figures if it's considered hate speech or a threat, however.
Objectionable content overlaps with some of the topics in other categories and more. Here's what's banned:
- Hate speech.
- Graphic violence that "glorifies violence or celebrates the suffering or humiliation of others." It does allow graphic violence in cases that "raise awareness about issues," however; in those cases, Facebook places a warning about the graphic content and requires viewers to be 18 or older.
- Adult nudity and sexual activity (nudity is allowed to raise awareness or for educational or medical reasons, however). Pictures of art, such as paintings and sculptures that depict nudity, are also allowed.
- Content that's deemed "cruel and insensitive" including "content that targets victims of serious physical or emotional harm."
In this section, Facebook discusses the type of content that falls outside of its other categories. This includes:
- Spam ("misleading or inaccurate information to collect likes, followers, or shares.")
- Misrepresentation - you must use a real, verifiable identity on Facebook.
- "False news" - Facebook says it tries to reduce the spread of "false news" but that satire is allowed: "For these reasons, we don't remove false news from Facebook but instead, significantly reduce its distribution by showing it lower in the News Feed."
Facebook includes "memorialization" in this category, meaning you can memorialize the account of someone who has passed away.
Facebook says it does not allow someone to post content that's owned by someone else, including anything with "copyrights, trademarks, and other legal rights."
It also says that you own everything you post. If you post a photograph you took, for example, you still own it, and Facebook doesn't claim rights to it.
Here Facebook lays out what it can do to help various users.
It will oblige if someone asks to delete their own account, remove a deceased immediate family member's account, or remove "an incapacitated user's account," so long as an authorized representative makes the request.
Facebook adds that it protects minors using the social network. It will oblige requests for:
- Removal of underage accounts (you need to be at least 13 years old to use Facebook).
- "Government requests for removal of child abuse imagery depicting, for example, beating by an adult or strangling or suffocating by an adult."
- "Legal guardian requests for removal of attacks on unintentionally famous minors."
You can learn more by visiting Facebook's Community Standards page.