- In a 46-page document, Facebook outlined the powers and limitations of the board and committed to funding the trust that will support the board for six years.
- The board will have the ability to review content moderation cases so long as they do not involve content posted on Facebook's marketplace, fundraisers, Facebook dating features or the Facebook-owned WhatsApp, Messenger, Instagram Direct or Oculus services.
- Facebook said that the board's decisions will not necessarily create precedents for the social network.
Facebook on Tuesday unveiled its proposed bylaws for the company's oversight board — a sort of "supreme court" that can theoretically overturn content moderation decisions — but the bylaws are filled with loopholes and bind Facebook to very little concrete action.
Facebook CEO Mark Zuckerberg first announced the oversight board in November 2018 as the company weathered numerous scandals and criticism regarding its privacy and content moderation practices. During the run-up to the 2016 election, for instance, conservatives criticized the company for de-emphasizing certain news sources in its "trending" news section.
In a blog post, Zuckerberg noted that Facebook has to make many decisions every day over whether to remove content that violates its policies on nudity, misinformation and other areas, but wrote that Facebook "should not make so many important decisions about free expression and safety on our own."
The solution was supposed to be an independent board that would make final, binding decisions on whether to reinstate removed content, and explain its reasons for doing so.
In a 46-page document released Tuesday, Facebook outlined the bylaws of that board, establishing its powers and its broad limitations. These bylaws make clear that Facebook is still firmly in control.
Article 2: Section 1.2.1 -- The following types of content are not available for the board's review, unless reassessed in the future by Facebook:
Content types: content posted through marketplace, fundraisers, Facebook dating, messages, and spam.
Decision types: decisions made on reports involving intellectual property or pursuant to legal obligations.
Services: content on WhatsApp, Messenger, Instagram Direct, and Oculus.
This means the oversight board will be extremely limited in which pieces of content it can actually review, leaving out major parts of the Facebook kingdom.
Article 2: Section 1.3.1 -- Facebook will fund the trust upfront for at least six (6) years. It will review the annual reports prepared by the trust to determine the operational and procedural effectiveness of the board.
In other words, Facebook can just let the board die after six years.
Article 2: Section 2.3.1 -- Facebook will implement board decisions to allow or remove the content properly brought to it for review within seven (7) days of the release of the board's decision on how to action the content. In addition, Facebook will undertake a review to determine if there is identical content with parallel context associated with the board's decision that remains on Facebook. If Facebook determines that it has the technical and operational capacity to take action on that content as well, it will do so promptly.
This means that the decisions made by the oversight board will, by default, apply narrowly to the specific piece of content that is being reviewed, and will not create any precedents that Facebook has to follow in the future for similar types of violations. The company retains final say on whether or not to broadly apply the decisions of the board.
Article 5: Section 1 -- These bylaws may be amended only with the approval of a majority of the individual trustees and with the agreement of Facebook and a majority of the board.
This means the board's members, trustees and Facebook can amend the bylaws. Some provisions require a two-thirds vote to change, but the document leaves room for quite a bit of revision.
In some cases where the board and trustees vote to amend the bylaws in ways that could be unfavorable to Facebook, the bylaws state Facebook can agree to the changes "where it has determined that it is technically, operationally, and legally capable of doing so."
In other words, Facebook may not be acting alone when it comes to removing content, but it's still firmly in control of its platform.
The company also announced that Thomas Hughes, former executive director of human rights organization Article 19, will serve as the director of the oversight board administration. Facebook said it plans to announce board members and trustees in the coming months.